Oct 01 13:05:38 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 01 13:05:38 crc restorecon[4666]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 13:05:38 crc restorecon[4666]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc 
restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc 
restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 
13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc 
restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc 
restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 13:05:38
crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:38 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 
13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc 
restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc 
restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc 
restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 13:05:39 crc restorecon[4666]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 
crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc 
restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc 
restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc 
restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc 
restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc 
restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 13:05:39 crc restorecon[4666]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 01 13:05:40 crc kubenswrapper[4749]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 13:05:40 crc kubenswrapper[4749]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 01 13:05:40 crc kubenswrapper[4749]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 13:05:40 crc kubenswrapper[4749]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 01 13:05:40 crc kubenswrapper[4749]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 01 13:05:40 crc kubenswrapper[4749]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.793618 4749 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811295 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811341 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811352 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811361 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811368 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811376 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811384 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811392 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811399 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811421 4749 
feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811431 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811441 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811449 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811456 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811464 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811471 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811480 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811488 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811495 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811503 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811510 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811518 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811526 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811533 4749 
feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811540 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811548 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811556 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811564 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811572 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811579 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811587 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811594 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811602 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811610 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811618 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811625 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811633 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811641 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators 
Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811648 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811656 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811663 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811671 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811685 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811696 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811705 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811713 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811721 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811730 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811739 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811747 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811756 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811764 4749 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811772 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811782 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811792 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811801 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811810 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811818 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811827 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811835 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811843 4749 feature_gate.go:330] unrecognized feature gate: Example Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811851 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811859 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811867 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811875 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811883 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 13:05:40 crc 
kubenswrapper[4749]: W1001 13:05:40.811891 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811899 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811908 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811919 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.811929 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812115 4749 flags.go:64] FLAG: --address="0.0.0.0" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812134 4749 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812147 4749 flags.go:64] FLAG: --anonymous-auth="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812160 4749 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812174 4749 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812183 4749 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812195 4749 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812206 4749 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812249 4749 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812272 4749 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 01 13:05:40 crc 
kubenswrapper[4749]: I1001 13:05:40.812286 4749 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812298 4749 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812310 4749 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812320 4749 flags.go:64] FLAG: --cgroup-root="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812329 4749 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812339 4749 flags.go:64] FLAG: --client-ca-file="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812347 4749 flags.go:64] FLAG: --cloud-config="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812356 4749 flags.go:64] FLAG: --cloud-provider="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812365 4749 flags.go:64] FLAG: --cluster-dns="[]" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812376 4749 flags.go:64] FLAG: --cluster-domain="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812385 4749 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812394 4749 flags.go:64] FLAG: --config-dir="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812403 4749 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812413 4749 flags.go:64] FLAG: --container-log-max-files="5" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812424 4749 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812433 4749 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812443 4749 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 01 
13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812452 4749 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812461 4749 flags.go:64] FLAG: --contention-profiling="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812470 4749 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812479 4749 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812489 4749 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812498 4749 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812508 4749 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812518 4749 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812528 4749 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812537 4749 flags.go:64] FLAG: --enable-load-reader="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812546 4749 flags.go:64] FLAG: --enable-server="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812555 4749 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812566 4749 flags.go:64] FLAG: --event-burst="100" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812575 4749 flags.go:64] FLAG: --event-qps="50" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812584 4749 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812592 4749 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812601 4749 flags.go:64] FLAG: --eviction-hard="" Oct 01 13:05:40 
crc kubenswrapper[4749]: I1001 13:05:40.812612 4749 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812622 4749 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812631 4749 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812640 4749 flags.go:64] FLAG: --eviction-soft="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812649 4749 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812658 4749 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812667 4749 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812675 4749 flags.go:64] FLAG: --experimental-mounter-path="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812684 4749 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812693 4749 flags.go:64] FLAG: --fail-swap-on="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812701 4749 flags.go:64] FLAG: --feature-gates="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812713 4749 flags.go:64] FLAG: --file-check-frequency="20s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812722 4749 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812732 4749 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812741 4749 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812750 4749 flags.go:64] FLAG: --healthz-port="10248" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812759 4749 flags.go:64] FLAG: --help="false" Oct 01 13:05:40 
crc kubenswrapper[4749]: I1001 13:05:40.812768 4749 flags.go:64] FLAG: --hostname-override="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812776 4749 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812785 4749 flags.go:64] FLAG: --http-check-frequency="20s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812794 4749 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812804 4749 flags.go:64] FLAG: --image-credential-provider-config="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812812 4749 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812821 4749 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812830 4749 flags.go:64] FLAG: --image-service-endpoint="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812839 4749 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812848 4749 flags.go:64] FLAG: --kube-api-burst="100" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812857 4749 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812867 4749 flags.go:64] FLAG: --kube-api-qps="50" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812875 4749 flags.go:64] FLAG: --kube-reserved="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812884 4749 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812893 4749 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812902 4749 flags.go:64] FLAG: --kubelet-cgroups="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812911 4749 flags.go:64] FLAG: 
--local-storage-capacity-isolation="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812919 4749 flags.go:64] FLAG: --lock-file="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812928 4749 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812937 4749 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812947 4749 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812960 4749 flags.go:64] FLAG: --log-json-split-stream="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812969 4749 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812977 4749 flags.go:64] FLAG: --log-text-split-stream="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812986 4749 flags.go:64] FLAG: --logging-format="text" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.812995 4749 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813005 4749 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813013 4749 flags.go:64] FLAG: --manifest-url="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813022 4749 flags.go:64] FLAG: --manifest-url-header="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813034 4749 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813044 4749 flags.go:64] FLAG: --max-open-files="1000000" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813054 4749 flags.go:64] FLAG: --max-pods="110" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813063 4749 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813072 4749 flags.go:64] FLAG: 
--maximum-dead-containers-per-container="1" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813081 4749 flags.go:64] FLAG: --memory-manager-policy="None" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813090 4749 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813099 4749 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813108 4749 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813117 4749 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813140 4749 flags.go:64] FLAG: --node-status-max-images="50" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813149 4749 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813158 4749 flags.go:64] FLAG: --oom-score-adj="-999" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813168 4749 flags.go:64] FLAG: --pod-cidr="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813176 4749 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813190 4749 flags.go:64] FLAG: --pod-manifest-path="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813198 4749 flags.go:64] FLAG: --pod-max-pids="-1" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813208 4749 flags.go:64] FLAG: --pods-per-core="0" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813246 4749 flags.go:64] FLAG: --port="10250" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813258 4749 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 01 13:05:40 crc 
kubenswrapper[4749]: I1001 13:05:40.813269 4749 flags.go:64] FLAG: --provider-id="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813280 4749 flags.go:64] FLAG: --qos-reserved="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813291 4749 flags.go:64] FLAG: --read-only-port="10255" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813300 4749 flags.go:64] FLAG: --register-node="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813309 4749 flags.go:64] FLAG: --register-schedulable="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813318 4749 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813333 4749 flags.go:64] FLAG: --registry-burst="10" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813344 4749 flags.go:64] FLAG: --registry-qps="5" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813353 4749 flags.go:64] FLAG: --reserved-cpus="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813361 4749 flags.go:64] FLAG: --reserved-memory="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813372 4749 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813381 4749 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813390 4749 flags.go:64] FLAG: --rotate-certificates="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813399 4749 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813408 4749 flags.go:64] FLAG: --runonce="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813416 4749 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813425 4749 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 
13:05:40.813435 4749 flags.go:64] FLAG: --seccomp-default="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813444 4749 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813453 4749 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813462 4749 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813471 4749 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813481 4749 flags.go:64] FLAG: --storage-driver-password="root" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813490 4749 flags.go:64] FLAG: --storage-driver-secure="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813500 4749 flags.go:64] FLAG: --storage-driver-table="stats" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813509 4749 flags.go:64] FLAG: --storage-driver-user="root" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813518 4749 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813527 4749 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813536 4749 flags.go:64] FLAG: --system-cgroups="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813545 4749 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813559 4749 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813567 4749 flags.go:64] FLAG: --tls-cert-file="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813576 4749 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813587 4749 flags.go:64] FLAG: --tls-min-version="" Oct 01 13:05:40 crc 
kubenswrapper[4749]: I1001 13:05:40.813596 4749 flags.go:64] FLAG: --tls-private-key-file="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813605 4749 flags.go:64] FLAG: --topology-manager-policy="none" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813614 4749 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813623 4749 flags.go:64] FLAG: --topology-manager-scope="container" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813632 4749 flags.go:64] FLAG: --v="2" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813643 4749 flags.go:64] FLAG: --version="false" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813654 4749 flags.go:64] FLAG: --vmodule="" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813664 4749 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.813674 4749 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813874 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813886 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813897 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813905 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813914 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813922 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813931 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813945 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813954 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813962 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813970 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813980 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813988 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.813996 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814004 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814012 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814020 
4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814029 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814038 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814046 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814054 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814062 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814070 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814077 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814085 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814092 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814100 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814108 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814115 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814123 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814130 4749 feature_gate.go:330] 
unrecognized feature gate: UpgradeStatus Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814138 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814146 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814154 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814161 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814169 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814178 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814186 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814195 4749 feature_gate.go:330] unrecognized feature gate: Example Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814203 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814211 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814245 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814254 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814269 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814277 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814285 
4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814293 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814300 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814308 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814316 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814323 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814331 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814339 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814347 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814355 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814362 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814372 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814382 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814392 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814400 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814408 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814416 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814424 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814432 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814439 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814447 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814454 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814464 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814493 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814502 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.814510 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.815293 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.830511 4749 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.830558 4749 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830647 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830657 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830664 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830669 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830674 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830679 4749 
feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830684 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830689 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830693 4749 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830698 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830702 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830706 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830711 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830715 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830719 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830724 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830728 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830732 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830737 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830741 4749 feature_gate.go:330] unrecognized feature gate: 
EtcdBackendQuota Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830746 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830750 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830754 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830761 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830768 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830781 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830788 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830793 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830797 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830803 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830807 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830811 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830817 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830822 4749 feature_gate.go:330] unrecognized feature gate: 
RouteAdvertisements Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830827 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830833 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830840 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830847 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830852 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830857 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830862 4749 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830867 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830872 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830877 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830881 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830885 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830890 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830894 4749 feature_gate.go:330] unrecognized feature gate: Example Oct 01 13:05:40 crc 
kubenswrapper[4749]: W1001 13:05:40.830898 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830903 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830908 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830912 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830917 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830921 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830927 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830931 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830937 4749 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830943 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830949 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830955 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830960 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830964 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830969 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830973 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830978 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830983 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830988 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830992 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.830996 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831001 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831005 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 
13:05:40.831013 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831172 4749 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831183 4749 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831190 4749 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831196 4749 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831202 4749 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831206 4749 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831211 4749 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831233 4749 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831237 4749 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831242 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831248 4749 feature_gate.go:330] 
unrecognized feature gate: ClusterAPIInstall Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831253 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831258 4749 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831263 4749 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831268 4749 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831273 4749 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831278 4749 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831282 4749 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831287 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831291 4749 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831296 4749 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831301 4749 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831305 4749 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831310 4749 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831314 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 
01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831320 4749 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831325 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831331 4749 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831336 4749 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831341 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831345 4749 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831350 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831354 4749 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831359 4749 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831364 4749 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831369 4749 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831373 4749 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831378 4749 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831383 4749 feature_gate.go:330] unrecognized 
feature gate: OVNObservability Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831388 4749 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831392 4749 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831397 4749 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831403 4749 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831407 4749 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831412 4749 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831416 4749 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831421 4749 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831425 4749 feature_gate.go:330] unrecognized feature gate: Example Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831430 4749 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831434 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831438 4749 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831443 4749 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831447 4749 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831454 4749 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831460 4749 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831465 4749 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831469 4749 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831474 4749 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831478 4749 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831490 4749 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831495 4749 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831499 4749 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831504 4749 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831508 4749 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831513 4749 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831517 4749 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831522 4749 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831526 4749 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831530 4749 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831534 4749 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 13:05:40 crc kubenswrapper[4749]: W1001 13:05:40.831539 4749 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.831547 4749 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.831764 4749 server.go:940] "Client rotation is on, will bootstrap in background" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.838474 4749 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.838609 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.849451 4749 server.go:997] "Starting client certificate rotation" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.849519 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.853200 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-10 20:10:45.627127075 +0000 UTC Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.853391 4749 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1687h5m4.77374205s for next certificate rotation Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.924427 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.926622 4749 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 13:05:40 crc kubenswrapper[4749]: I1001 13:05:40.950657 4749 log.go:25] "Validated CRI v1 runtime API" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.009663 4749 log.go:25] "Validated CRI v1 image API" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.012235 4749 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.022169 4749 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-13-00-32-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.022267 4749 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.054409 4749 manager.go:217] Machine: {Timestamp:2025-10-01 13:05:41.050261108 +0000 UTC m=+1.104246027 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a6812da7-649f-44b0-a677-765984715a01 BootID:0a611ef5-4004-4e1a-b60a-007b3d7463fd Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e5:bf:7f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e5:bf:7f Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:1b:22:68 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:7c:85:ff Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:49:69:29 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:dd:c9:0b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:4b:d9:08:40:ab Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:ff:51:f4:42:2a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.054851 4749 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.055096 4749 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.055640 4749 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.055948 4749 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.056007 4749 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.056482 4749 topology_manager.go:138] "Creating topology manager with none policy"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.056500 4749 container_manager_linux.go:303] "Creating device plugin manager"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.057303 4749 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.057382 4749 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.058410 4749 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.058595 4749 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.069636 4749 kubelet.go:418] "Attempting to sync node with API server"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.069694 4749 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.069744 4749 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.069768 4749 kubelet.go:324] "Adding apiserver pod source"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.069790 4749 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.077522 4749 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.079053 4749 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.082035 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused
Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.082104 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError"
Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.082159 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused
Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.082332 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.101693 4749 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.103989 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104034 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104054 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104074 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104103 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104122 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104139 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104168 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104188 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104211 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104295 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104316 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.104359 4749 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.105196 4749 server.go:1280] "Started kubelet"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.105789 4749 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.105779 4749 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.106593 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.106745 4749 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 01 13:05:41 crc systemd[1]: Started Kubernetes Kubelet.
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.108054 4749 server.go:460] "Adding debug handlers to kubelet server"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.110324 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.110413 4749 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.110483 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 23:50:57.954198878 +0000 UTC
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.110523 4749 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1234h45m16.84367789s for next certificate rotation
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.110560 4749 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.110568 4749 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.110600 4749 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.110838 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.111255 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused
Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.121522 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.122909 4749 factory.go:55] Registering systemd factory
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.122964 4749 factory.go:221] Registration of the systemd container factory successfully
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.123367 4749 factory.go:153] Registering CRI-O factory
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.123493 4749 factory.go:221] Registration of the crio container factory successfully
Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.123380 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.220:6443: connect: connection refused" interval="200ms"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.123873 4749 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.124454 4749 factory.go:103] Registering Raw factory
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.124569 4749 manager.go:1196] Started watching for new ooms in manager
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.125626 4749 manager.go:319] Starting recovery of all containers
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.173794 4749 manager.go:324] Recovery completed
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174714 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174776 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174790 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174801 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174809 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174819 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174830 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174841 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174853 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174863 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174873 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174882 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174894 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174904 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174914 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174922 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174931 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174963 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174976 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174986 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.174995 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175004 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175015 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175025 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175035 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175045 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.145942 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.220:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a5fcd9a69a76e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 13:05:41.105149806 +0000 UTC m=+1.159134745,LastTimestamp:2025-10-01 13:05:41.105149806 +0000 UTC m=+1.159134745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175654 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175673 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175685 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175696 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175707 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175719 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175731 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175742 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175760 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175768 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175778 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175788 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175798 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175809 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175819 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175829 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175837 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175864 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175875 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175887 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175896 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175906 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175915 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175925 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175936 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175949 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175963 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175975 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175985 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.175995 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176006 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176016 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176026 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176036 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176045 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176053 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176062 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176072 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176079 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176090 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176100 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176109 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176118 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176127 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176135 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176143 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176153 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176162 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176171 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176180 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176189 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176197 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176206 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176214 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176236 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176245 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176256 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176265 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176274 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.176283 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179345 4749 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179372 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179385 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179397 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179407 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179419 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179430 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179464 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179476 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179487 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179497 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179506 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179516 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179525 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" 
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179535 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179543 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.179570 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180045 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180063 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180080 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180091 4749 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180122 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180154 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180173 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180368 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180395 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180409 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180421 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180433 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180446 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180460 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180472 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180485 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180496 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180509 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180519 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180527 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180536 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180545 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" 
seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180556 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180568 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180581 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180593 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180608 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180620 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 
13:05:41.180631 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180643 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180655 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180665 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180676 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180685 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180693 4749 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180701 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180710 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180718 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180727 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180735 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180743 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180751 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180760 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180770 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180786 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180795 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180803 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180812 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180820 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180829 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180838 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180846 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180853 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180861 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180870 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180879 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180888 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180896 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180905 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" 
Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180914 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180923 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180932 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180940 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180949 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180957 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180965 
4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180974 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180982 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180990 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.180998 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181009 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181021 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181030 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181038 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181048 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181057 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181686 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181763 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181789 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181811 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181831 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181859 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181879 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181900 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181919 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181939 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181962 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.181981 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182000 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182019 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182038 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182059 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182078 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182098 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182119 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182141 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182159 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182177 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182196 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182242 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182261 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182281 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182338 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182366 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182390 4749 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182414 4749 reconstruct.go:97] "Volume reconstruction finished" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.182435 4749 reconciler.go:26] "Reconciler: start to sync state" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.191742 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.193808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.193851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.193864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.195011 4749 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.195042 4749 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.195084 4749 state_mem.go:36] "Initialized new in-memory state store" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.210964 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.226919 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.228483 4749 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.228562 4749 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.228610 4749 kubelet.go:2335] "Starting kubelet main sync loop" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.228809 4749 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.229416 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.229488 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial 
tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.282603 4749 policy_none.go:49] "None policy: Start" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.284140 4749 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.284198 4749 state_mem.go:35] "Initializing new in-memory state store" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.311081 4749 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.324485 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.220:6443: connect: connection refused" interval="400ms" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.328919 4749 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.342728 4749 manager.go:334] "Starting Device Plugin manager" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.342942 4749 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.342962 4749 server.go:79] "Starting device plugin registration server" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.343464 4749 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.343492 4749 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.343713 4749 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.343835 4749 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.343857 4749 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.351798 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.444251 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.445935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.445991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.446011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.446046 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.446771 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.220:6443: connect: connection refused" node="crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.529432 4749 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 13:05:41 crc 
kubenswrapper[4749]: I1001 13:05:41.529566 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.531266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.531319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.531342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.531549 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.532431 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.532479 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.533709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.533867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.533912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.533929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.534491 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.534552 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.534793 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.535079 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.535165 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.538363 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.538407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.538423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.538644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.538681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.538697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.538855 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.539111 4749 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.539187 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.541653 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.541701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.541718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.542029 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.542200 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.542286 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.548844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.548858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.548916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.548939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.548948 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.548970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.549062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.549127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.549149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.549526 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.549575 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.551127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.551193 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.551248 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.647083 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.649004 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.649083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.649113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.649161 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.649935 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.220:6443: connect: connection refused" node="crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688070 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688139 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688243 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" 
(UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688455 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688488 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688546 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.688660 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.726184 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.220:6443: connect: connection refused" interval="800ms" Oct 01 13:05:41 crc 
kubenswrapper[4749]: I1001 13:05:41.789487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789645 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789700 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789784 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789874 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789708 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789958 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.789985 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.790036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.790076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.790062 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.790105 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.790135 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.790168 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.790172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.790269 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.891353 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.903321 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.924155 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.951646 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.961986 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-209923f13098dcd9ec8e0f7ad16f405fa42f9c39feb5f72086e089d44300e017 WatchSource:0}: Error finding container 209923f13098dcd9ec8e0f7ad16f405fa42f9c39feb5f72086e089d44300e017: Status 404 returned error can't find the container with id 209923f13098dcd9ec8e0f7ad16f405fa42f9c39feb5f72086e089d44300e017 Oct 01 13:05:41 crc kubenswrapper[4749]: I1001 13:05:41.962317 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.963864 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-f2e107f09fa6b01ec40d7f08e0dbf13a2a5d7501c107957d4c074cabc793f4ee WatchSource:0}: Error finding container f2e107f09fa6b01ec40d7f08e0dbf13a2a5d7501c107957d4c074cabc793f4ee: Status 404 returned error can't find the container with id f2e107f09fa6b01ec40d7f08e0dbf13a2a5d7501c107957d4c074cabc793f4ee Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.970876 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-0b07af7c32d2f3844d9c3a27717d5b9a09381d2b3364e0cd7f60b1a2dca52494 WatchSource:0}: Error finding container 0b07af7c32d2f3844d9c3a27717d5b9a09381d2b3364e0cd7f60b1a2dca52494: Status 404 returned error can't find the container with id 0b07af7c32d2f3844d9c3a27717d5b9a09381d2b3364e0cd7f60b1a2dca52494 Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.982504 4749 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-adff0b259fb5069099a4c6af04d6e24140c8f4309258195b3d9fc07ee1e87b31 WatchSource:0}: Error finding container adff0b259fb5069099a4c6af04d6e24140c8f4309258195b3d9fc07ee1e87b31: Status 404 returned error can't find the container with id adff0b259fb5069099a4c6af04d6e24140c8f4309258195b3d9fc07ee1e87b31 Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.984803 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b75e6b53d4e8f00e7130e6e34c662d9c5bd1b9d7918c3727c59de0e489624351 WatchSource:0}: Error finding container b75e6b53d4e8f00e7130e6e34c662d9c5bd1b9d7918c3727c59de0e489624351: Status 404 returned error can't find the container with id b75e6b53d4e8f00e7130e6e34c662d9c5bd1b9d7918c3727c59de0e489624351 Oct 01 13:05:41 crc kubenswrapper[4749]: W1001 13:05:41.998719 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:41 crc kubenswrapper[4749]: E1001 13:05:41.998903 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.051074 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.053414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.053478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.053504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.053545 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:05:42 crc kubenswrapper[4749]: E1001 13:05:42.054210 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.220:6443: connect: connection refused" node="crc" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.107633 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.235684 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0b07af7c32d2f3844d9c3a27717d5b9a09381d2b3364e0cd7f60b1a2dca52494"} Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.237312 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"f2e107f09fa6b01ec40d7f08e0dbf13a2a5d7501c107957d4c074cabc793f4ee"} Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.239169 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"209923f13098dcd9ec8e0f7ad16f405fa42f9c39feb5f72086e089d44300e017"} Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.240980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b75e6b53d4e8f00e7130e6e34c662d9c5bd1b9d7918c3727c59de0e489624351"} Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.242841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"adff0b259fb5069099a4c6af04d6e24140c8f4309258195b3d9fc07ee1e87b31"} Oct 01 13:05:42 crc kubenswrapper[4749]: W1001 13:05:42.260289 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:42 crc kubenswrapper[4749]: E1001 13:05:42.260444 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:42 crc kubenswrapper[4749]: W1001 13:05:42.314148 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:42 crc kubenswrapper[4749]: E1001 13:05:42.314305 4749 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:42 crc kubenswrapper[4749]: W1001 13:05:42.375819 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:42 crc kubenswrapper[4749]: E1001 13:05:42.375959 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:42 crc kubenswrapper[4749]: E1001 13:05:42.527725 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.220:6443: connect: connection refused" interval="1.6s" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.855310 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.856905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.856977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.857002 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 01 13:05:42 crc kubenswrapper[4749]: I1001 13:05:42.857045 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:05:42 crc kubenswrapper[4749]: E1001 13:05:42.857648 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.220:6443: connect: connection refused" node="crc" Oct 01 13:05:43 crc kubenswrapper[4749]: I1001 13:05:43.107853 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:43 crc kubenswrapper[4749]: E1001 13:05:43.867038 4749 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.220:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a5fcd9a69a76e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 13:05:41.105149806 +0000 UTC m=+1.159134745,LastTimestamp:2025-10-01 13:05:41.105149806 +0000 UTC m=+1.159134745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 01 13:05:44 crc kubenswrapper[4749]: W1001 13:05:44.064846 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:44 crc kubenswrapper[4749]: E1001 
13:05:44.064904 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.108413 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:44 crc kubenswrapper[4749]: E1001 13:05:44.129073 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.220:6443: connect: connection refused" interval="3.2s" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.250323 4749 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5c13525193c9f225f9007db3deffb8b23206b884913a990448e6f181c358f597" exitCode=0 Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.250456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5c13525193c9f225f9007db3deffb8b23206b884913a990448e6f181c358f597"} Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.250470 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.251573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.251611 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.251623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.252154 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0" exitCode=0 Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.252294 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.252285 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0"} Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.253348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.253412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.253430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.254457 4749 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61" exitCode=0 Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.254508 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61"} Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.254588 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.255780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.255805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.255816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.257795 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517" exitCode=0 Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.257877 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517"} Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.258021 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.259355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.259382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.259395 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.260539 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c"} Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.260570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18"} Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.260585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6"} Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.261448 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.263302 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.263351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.263365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:44 crc kubenswrapper[4749]: W1001 13:05:44.308704 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:44 crc kubenswrapper[4749]: E1001 13:05:44.308884 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.458159 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.460371 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.460428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.460443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:44 crc kubenswrapper[4749]: I1001 13:05:44.460487 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:05:44 crc kubenswrapper[4749]: E1001 13:05:44.461764 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.220:6443: connect: connection refused" node="crc" Oct 01 13:05:44 crc kubenswrapper[4749]: W1001 13:05:44.637875 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:44 crc 
kubenswrapper[4749]: E1001 13:05:44.638034 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:45 crc kubenswrapper[4749]: W1001 13:05:45.001936 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:45 crc kubenswrapper[4749]: E1001 13:05:45.002044 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.108420 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.267202 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a"} Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.267279 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21"} Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.272130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130"} Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.272173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392"} Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.278075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2"} Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.278115 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.280498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.280565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.280586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.284564 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cc0d9e3e9caa733b93ea461ffc45e5f91d2c5c1b968bbe533c87e851217d0db5"} Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.284703 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.287383 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.287436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.287458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.289143 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86" exitCode=0 Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.289271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86"} Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.289310 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.290843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.290884 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:45 crc kubenswrapper[4749]: I1001 13:05:45.290902 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.107906 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.295541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd"} Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.295596 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33"} Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.297342 4749 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda" exitCode=0 Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.297389 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda"} Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.297465 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.298839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.298879 4749 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.298897 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.301714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef"} Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.301776 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.301793 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.301737 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303511 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:46 crc kubenswrapper[4749]: I1001 13:05:46.303603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.107840 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.310133 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2"} Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.310197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a"} Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.316439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32"} Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.316536 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 
01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.316558 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.316536 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.322182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.322253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.322272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.323169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.323206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.323247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:47 crc kubenswrapper[4749]: E1001 13:05:47.329654 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.220:6443: connect: connection refused" interval="6.4s" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.661908 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.663727 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.664086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.664104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.664139 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:05:47 crc kubenswrapper[4749]: E1001 13:05:47.664692 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.220:6443: connect: connection refused" node="crc" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.677308 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:47 crc kubenswrapper[4749]: W1001 13:05:47.792967 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:47 crc kubenswrapper[4749]: E1001 13:05:47.793090 4749 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.991451 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.991692 4749 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Oct 01 13:05:47 crc kubenswrapper[4749]: I1001 13:05:47.991768 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.107773 4749 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.327805 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.331542 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32" exitCode=255 Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.331601 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32"} Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.331684 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.332895 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.332945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.332964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.333947 4749 scope.go:117] "RemoveContainer" containerID="82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.336998 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651"} Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.337160 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.338290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.338394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.338492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:48 crc kubenswrapper[4749]: W1001 13:05:48.401439 4749 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.220:6443: connect: connection refused Oct 01 13:05:48 crc kubenswrapper[4749]: E1001 13:05:48.401551 4749 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.220:6443: connect: connection refused" logger="UnhandledError" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.992177 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.992404 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.993712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.993784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:48 crc kubenswrapper[4749]: I1001 13:05:48.993795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.182789 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.307713 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.343019 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.346419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98"} Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.346641 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.346839 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.347642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.347668 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.347680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.351181 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623"} Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.351275 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.351810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.351834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.351843 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 01 13:05:49 crc kubenswrapper[4749]: I1001 13:05:49.784575 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.361084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670"} Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.361192 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.361325 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.361387 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.362654 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365830 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365854 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:50 crc kubenswrapper[4749]: I1001 13:05:50.365945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:51 crc kubenswrapper[4749]: E1001 13:05:51.352522 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.363419 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.364205 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.364427 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.365324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.365381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.365406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:51 crc 
kubenswrapper[4749]: I1001 13:05:51.366006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.366017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.366316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.366337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.366276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.366472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:51 crc kubenswrapper[4749]: I1001 13:05:51.519274 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 01 13:05:52 crc kubenswrapper[4749]: I1001 13:05:52.366170 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:05:52 crc kubenswrapper[4749]: I1001 13:05:52.367576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:05:52 crc kubenswrapper[4749]: I1001 13:05:52.367623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:05:52 crc kubenswrapper[4749]: I1001 13:05:52.367640 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:05:53 crc kubenswrapper[4749]: I1001 13:05:53.918745 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 13:05:53 crc kubenswrapper[4749]: I1001 13:05:53.918909 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 13:05:53 crc kubenswrapper[4749]: I1001 13:05:53.920478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:05:53 crc kubenswrapper[4749]: I1001 13:05:53.920536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:05:53 crc kubenswrapper[4749]: I1001 13:05:53.920559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:05:53 crc kubenswrapper[4749]: I1001 13:05:53.925168 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.065481 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.067445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.067508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.067529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.067570 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.372201 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.373407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.373468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:05:54 crc kubenswrapper[4749]: I1001 13:05:54.373486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:05:55 crc kubenswrapper[4749]: I1001 13:05:55.736377 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Oct 01 13:05:55 crc kubenswrapper[4749]: I1001 13:05:55.736745 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 13:05:55 crc kubenswrapper[4749]: I1001 13:05:55.738455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:05:55 crc kubenswrapper[4749]: I1001 13:05:55.738557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:05:55 crc kubenswrapper[4749]: I1001 13:05:55.738580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:05:56 crc kubenswrapper[4749]: I1001 13:05:56.919435 4749 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Oct 01 13:05:56 crc kubenswrapper[4749]: I1001 13:05:56.920359 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Oct 01 13:05:58 crc kubenswrapper[4749]: I1001 13:05:58.252283 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 01 13:05:58 crc kubenswrapper[4749]: I1001 13:05:58.252395 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 01 13:05:59 crc kubenswrapper[4749]: I1001 13:05:59.060103 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 01 13:05:59 crc kubenswrapper[4749]: I1001 13:05:59.060201 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 01 13:05:59 crc kubenswrapper[4749]: I1001 13:05:59.064438 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 01 13:05:59 crc kubenswrapper[4749]: I1001 13:05:59.064501 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 01 13:06:01 crc kubenswrapper[4749]: E1001 13:06:01.352652 4749 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.002028 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.002318 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.002828 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.002939 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.004122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.004181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.004202 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.008112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.401069 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.401754 4749 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.401844 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.402483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.402538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:03 crc kubenswrapper[4749]: I1001 13:06:03.402555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.035276 4749 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="7s"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.041686 4749 trace.go:236] Trace[702455870]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 13:05:49.501) (total time: 14540ms):
Oct 01 13:06:04 crc kubenswrapper[4749]: Trace[702455870]: ---"Objects listed" error: 14540ms (13:06:04.041)
Oct 01 13:06:04 crc kubenswrapper[4749]: Trace[702455870]: [14.54042759s] [14.54042759s] END
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.041738 4749 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.041787 4749 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.042140 4749 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.047795 4749 trace.go:236] Trace[1835898878]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 13:05:50.802) (total time: 13245ms):
Oct 01 13:06:04 crc kubenswrapper[4749]: Trace[1835898878]: ---"Objects listed" error: 13245ms (13:06:04.047)
Oct 01 13:06:04 crc kubenswrapper[4749]: Trace[1835898878]: [13.24535558s] [13.24535558s] END
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.047838 4749 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.049385 4749 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.080612 4749 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.088896 4749 apiserver.go:52] "Watching apiserver"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.093885 4749 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.094372 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"]
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.094909 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.095020 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.095168 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.095635 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.095721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.095929 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.096044 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.096092 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.096525 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.098257 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.098747 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.098994 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.099371 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.099708 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.099791 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.099927 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.101154 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.101166 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.111720 4749 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142404 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142507 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142581 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142643 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142712 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142747 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142781 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142948 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.142978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143016 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143143 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143176 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143210 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143265 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143301 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143412 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143550 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143638 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143736 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143781 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.143998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144033 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144066 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144100 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144136 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144169 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144204 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144265 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144374 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144409 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144551 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144601 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144639 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144673 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.144717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145015 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145060 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145112 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145153 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145193 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145262 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512").
InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145343 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145379 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145413 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145447 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145487 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145521 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145503 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145567 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145605 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145642 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145708 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145808 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145841 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:06:04 crc 
kubenswrapper[4749]: I1001 13:06:04.145874 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145940 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145975 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146006 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146041 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146077 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146109 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146139 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146171 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146203 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146262 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146325 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146433 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" 
(UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146464 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146497 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146527 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146561 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146595 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 
13:06:04.146634 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146669 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146736 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146766 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146796 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") 
pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146854 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146918 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146987 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 
13:06:04.147022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147053 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147121 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" 
(UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147241 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147278 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147311 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147342 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147376 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147405 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147528 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147564 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147596 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 
13:06:04.147630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147695 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147728 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147846 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147933 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147975 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148132 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148170 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148207 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148625 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148673 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148710 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148747 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148785 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148820 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148858 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148895 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148971 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149055 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149092 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149130 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149167 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149203 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149323 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149364 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149406 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149516 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 
13:06:04.149557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149597 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149638 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149681 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149724 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149761 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149800 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149835 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149877 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149918 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149957 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 13:06:04 
crc kubenswrapper[4749]: I1001 13:06:04.149997 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150036 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150074 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150114 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150155 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150192 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 13:06:04 crc kubenswrapper[4749]: 
I1001 13:06:04.150555 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150597 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150639 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150678 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150799 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.155724 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156580 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:04 crc kubenswrapper[4749]: 
I1001 13:06:04.156856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157190 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157240 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157261 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145562 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145823 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.145866 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146015 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.159041 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.159302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.159336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146039 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146198 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146266 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.160860 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.161402 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.161788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.162091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.162154 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.168508 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169019 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169115 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169153 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169439 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169541 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169591 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169707 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.169959 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.170186 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.170512 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.174728 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.175145 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146272 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146295 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146308 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146522 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146544 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146549 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146539 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146579 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146593 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146780 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146776 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146838 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.146994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147067 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147333 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147459 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147515 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147988 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148032 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148079 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148088 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148526 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148625 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148863 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.148951 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149157 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149257 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149282 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149480 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149589 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149644 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.149849 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150025 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150016 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150113 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150400 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.150790 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.151035 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.151307 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.151351 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.152275 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.152457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.152615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.152935 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.153277 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.153491 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.153593 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.153202 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.153980 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.154008 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.154584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.154594 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.154666 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.154760 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.154842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.155532 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.155758 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.155966 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156038 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156088 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156395 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.156895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157748 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.157994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.158011 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.158268 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.158399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.158898 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.159000 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.159521 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.160151 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.175524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.175525 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.175847 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.175994 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.176415 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.176815 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.176920 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.177154 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.179625 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.147150 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.179785 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.179835 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.179889 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.180055 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.180795 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.182840 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.183902 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.184330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.184475 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.184594 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.184903 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185138 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185411 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.183876 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185538 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.183376 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185853 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185887 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.185996 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.186184 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.187918 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.188660 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.189164 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.193446 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.193446 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.193510 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.193702 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.193966 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.194013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.194364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.191290 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.191617 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.191680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.195142 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.192265 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:06:04.692140957 +0000 UTC m=+24.746125876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.195383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.195489 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:04.695429722 +0000 UTC m=+24.749414621 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.192390 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.192742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.193076 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.195743 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.195939 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.196110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.196114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.196266 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.199659 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:04.699621742 +0000 UTC m=+24.753606641 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.199698 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.199705 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:04.699687094 +0000 UTC m=+24.753671993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.199974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.201768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.202313 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.203743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.203932 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.204641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.205117 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.205779 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.206008 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.206298 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.206376 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.206795 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.206974 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.207074 4749 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.208105 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.209382 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.209435 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.211512 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.212154 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.213955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.214279 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.217335 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.217787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.217834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.218120 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.218346 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.218372 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.218387 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.218560 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:04.718533837 +0000 UTC m=+24.772518736 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.227245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.227308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.227769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.227823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.227946 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.228093 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.228235 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.228884 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.234814 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.235614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.235851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.239126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.256497 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260378 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260391 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260401 4749 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260411 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260421 4749 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260429 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260438 4749 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260448 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260458 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260468 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260478 4749 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260487 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260495 4749 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260504 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260514 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260523 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" 
DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260532 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260542 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260550 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260561 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260571 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260580 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260589 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc 
kubenswrapper[4749]: I1001 13:06:04.260600 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260613 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260623 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260634 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260643 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260652 4749 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260661 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260669 4749 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260677 4749 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260686 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260696 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260704 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260713 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260722 4749 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260731 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" 
DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260742 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260751 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260759 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260768 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260778 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260787 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260796 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260805 4749 
reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260814 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260824 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260833 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260842 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260852 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260861 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260869 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260879 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260887 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260896 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260905 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260914 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260923 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260934 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" 
DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260941 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260954 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260962 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260971 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260979 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260988 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.260997 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261006 4749 
reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261015 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261023 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261032 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261042 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261050 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261058 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261068 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261080 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261091 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261100 4749 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261110 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261120 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261129 4749 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261137 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on 
node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261145 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261154 4749 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261162 4749 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261171 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261179 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261187 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261196 4749 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc 
kubenswrapper[4749]: I1001 13:06:04.261205 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261237 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261246 4749 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261255 4749 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261266 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261275 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261283 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 
13:06:04.261292 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261302 4749 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261309 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261318 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261326 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261334 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261343 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261352 4749 reconciler_common.go:293] "Volume 
detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261362 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261370 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261379 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261388 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261398 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261413 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261421 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261430 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261439 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261447 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261455 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261463 4749 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261470 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261479 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261488 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261497 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261506 4749 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261514 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261521 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261529 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261538 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261548 4749 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261557 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261565 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261573 4749 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261581 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261589 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261597 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node 
\"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261606 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261613 4749 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261621 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261628 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261637 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261644 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261652 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261663 4749 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261671 4749 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261679 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261687 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261696 4749 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261705 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261714 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261722 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261730 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261739 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261747 4749 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261756 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261764 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261772 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261779 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261787 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261795 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261803 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261813 4749 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261822 4749 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261831 4749 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261839 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261846 4749 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261855 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261863 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261871 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261879 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261887 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261896 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261904 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261912 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261920 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261931 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261939 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261947 4749 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261955 4749 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261963 4749 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" 
DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261972 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261980 4749 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261989 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.261997 4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262005 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262013 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262021 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262031 
4749 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262039 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262048 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262057 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262066 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.262839 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.263904 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.266578 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.274196 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.285426 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.306655 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.340695 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.365790 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.365820 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.365830 4749 reconciler_common.go:293] "Volume detached 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.379857 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.400150 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.405710 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.406250 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.408093 4749 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98" exitCode=255 Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.408162 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98"} Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.408265 4749 scope.go:117] "RemoveContainer" containerID="82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.418514 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.421507 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.422135 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.430846 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.430883 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.433124 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.433516 4749 scope.go:117] "RemoveContainer" containerID="b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.433715 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.441400 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.442692 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster
-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: W1001 13:06:04.444124 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ef9229bd90e736c1f3d5839e944c2a6a04c29ed254de6fffb6279b68b04d7476 WatchSource:0}: Error finding container ef9229bd90e736c1f3d5839e944c2a6a04c29ed254de6fffb6279b68b04d7476: Status 404 returned error can't find the container with id ef9229bd90e736c1f3d5839e944c2a6a04c29ed254de6fffb6279b68b04d7476 Oct 01 13:06:04 crc kubenswrapper[4749]: W1001 13:06:04.458854 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-7fbbc3f35120309d3d734b1c4bda1c2eea48e1264c45eb92fac918adb565586b WatchSource:0}: Error finding 
container 7fbbc3f35120309d3d734b1c4bda1c2eea48e1264c45eb92fac918adb565586b: Status 404 returned error can't find the container with id 7fbbc3f35120309d3d734b1c4bda1c2eea48e1264c45eb92fac918adb565586b Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.464480 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.481757 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.495931 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.511457 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.524328 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.536679 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.555580 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.564294 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.577586 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.590263 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:05:47Z\\\",\\\"message\\\":\\\"W1001 13:05:46.849340 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:05:46.849656 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759323946 cert, and key in /tmp/serving-cert-604554110/serving-signer.crt, /tmp/serving-cert-604554110/serving-signer.key\\\\nI1001 13:05:47.223077 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:05:47.226382 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:05:47.226685 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:05:47.227789 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-604554110/tls.crt::/tmp/serving-cert-604554110/tls.key\\\\\\\"\\\\nF1001 13:05:47.558307 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.602937 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.768821 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.768923 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.768951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.768972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.768990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769029 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:06:05.768988614 +0000 UTC m=+25.822973513 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769081 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769097 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769161 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:05.769140309 +0000 UTC m=+25.823125208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769185 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 13:06:05.76917569 +0000 UTC m=+25.823160589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769286 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769322 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769331 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769336 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769351 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769353 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769395 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:05.769387246 +0000 UTC m=+25.823372135 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:04 crc kubenswrapper[4749]: E1001 13:06:04.769438 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:05.769403606 +0000 UTC m=+25.823388505 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.828057 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6hrtf"] Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.828427 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6hrtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.830155 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4tfdz"] Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.830293 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.830482 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.830601 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.830654 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.833684 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.833814 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.833841 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.835823 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.836050 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.843554 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.853235 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.862869 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.872927 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.885446 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:05:47Z\\\",\\\"message\\\":\\\"W1001 13:05:46.849340 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:05:46.849656 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759323946 cert, and key in /tmp/serving-cert-604554110/serving-signer.crt, /tmp/serving-cert-604554110/serving-signer.key\\\\nI1001 13:05:47.223077 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:05:47.226382 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:05:47.226685 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:05:47.227789 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-604554110/tls.crt::/tmp/serving-cert-604554110/tls.key\\\\\\\"\\\\nF1001 13:05:47.558307 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.895807 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.903542 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.912794 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.927423 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.949175 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.962247 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.970740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/94d6a9bc-756a-41ca-9ce3-44b6fc834a78-hosts-file\") pod \"node-resolver-6hrtf\" (UID: \"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\") " pod="openshift-dns/node-resolver-6hrtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.970805 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwsdj\" (UniqueName: \"kubernetes.io/projected/94d6a9bc-756a-41ca-9ce3-44b6fc834a78-kube-api-access-gwsdj\") pod \"node-resolver-6hrtf\" (UID: \"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\") " pod="openshift-dns/node-resolver-6hrtf" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.970878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c763aedc-e75b-471c-83d7-2c9a87da1aaf-proxy-tls\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.970910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c763aedc-e75b-471c-83d7-2c9a87da1aaf-mcd-auth-proxy-config\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.970944 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c763aedc-e75b-471c-83d7-2c9a87da1aaf-rootfs\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.971123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pchmz\" (UniqueName: \"kubernetes.io/projected/c763aedc-e75b-471c-83d7-2c9a87da1aaf-kube-api-access-pchmz\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.973740 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.982970 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:04 crc kubenswrapper[4749]: I1001 13:06:04.993576 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:05:47Z\\\",\\\"message\\\":\\\"W1001 13:05:46.849340 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:05:46.849656 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759323946 cert, and key in /tmp/serving-cert-604554110/serving-signer.crt, /tmp/serving-cert-604554110/serving-signer.key\\\\nI1001 13:05:47.223077 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:05:47.226382 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:05:47.226685 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:05:47.227789 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-604554110/tls.crt::/tmp/serving-cert-604554110/tls.key\\\\\\\"\\\\nF1001 13:05:47.558307 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 
maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.020453 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.039644 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.065848 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.071660 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c763aedc-e75b-471c-83d7-2c9a87da1aaf-proxy-tls\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.071703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c763aedc-e75b-471c-83d7-2c9a87da1aaf-mcd-auth-proxy-config\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.071728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c763aedc-e75b-471c-83d7-2c9a87da1aaf-rootfs\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.071751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pchmz\" (UniqueName: \"kubernetes.io/projected/c763aedc-e75b-471c-83d7-2c9a87da1aaf-kube-api-access-pchmz\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.071789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/94d6a9bc-756a-41ca-9ce3-44b6fc834a78-hosts-file\") pod \"node-resolver-6hrtf\" (UID: \"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\") " 
pod="openshift-dns/node-resolver-6hrtf" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.071808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwsdj\" (UniqueName: \"kubernetes.io/projected/94d6a9bc-756a-41ca-9ce3-44b6fc834a78-kube-api-access-gwsdj\") pod \"node-resolver-6hrtf\" (UID: \"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\") " pod="openshift-dns/node-resolver-6hrtf" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.072682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c763aedc-e75b-471c-83d7-2c9a87da1aaf-rootfs\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.072873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/94d6a9bc-756a-41ca-9ce3-44b6fc834a78-hosts-file\") pod \"node-resolver-6hrtf\" (UID: \"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\") " pod="openshift-dns/node-resolver-6hrtf" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.073257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c763aedc-e75b-471c-83d7-2c9a87da1aaf-mcd-auth-proxy-config\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.077721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c763aedc-e75b-471c-83d7-2c9a87da1aaf-proxy-tls\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 
crc kubenswrapper[4749]: I1001 13:06:05.084814 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.094660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwsdj\" (UniqueName: \"kubernetes.io/projected/94d6a9bc-756a-41ca-9ce3-44b6fc834a78-kube-api-access-gwsdj\") pod \"node-resolver-6hrtf\" (UID: \"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\") " pod="openshift-dns/node-resolver-6hrtf" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.098613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pchmz\" (UniqueName: \"kubernetes.io/projected/c763aedc-e75b-471c-83d7-2c9a87da1aaf-kube-api-access-pchmz\") pod \"machine-config-daemon-4tfdz\" (UID: \"c763aedc-e75b-471c-83d7-2c9a87da1aaf\") " pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.100955 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.144455 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6hrtf" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.152555 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.221618 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-8sqjb"] Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.222384 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.224894 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.225610 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.225993 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.226105 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.226438 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.235963 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.236950 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.238330 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.239230 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 13:06:05 crc 
kubenswrapper[4749]: I1001 13:06:05.239865 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.240108 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/stat
ic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.240846 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.241937 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.242571 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.243802 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.244356 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.249774 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.250827 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.252869 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.253506 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.254079 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.258051 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.258926 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.259983 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.260414 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.260625 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.261206 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.262422 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.263332 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.264249 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.265697 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.268499 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.269183 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.270733 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.271264 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.273117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-cnibin\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.273154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-os-release\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.273180 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.273209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aee52150-78bb-49e5-a5ea-dd237863c810-cni-binary-copy\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.273257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aee52150-78bb-49e5-a5ea-dd237863c810-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.273278 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-system-cni-dir\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc 
kubenswrapper[4749]: I1001 13:06:05.273297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7cdg\" (UniqueName: \"kubernetes.io/projected/aee52150-78bb-49e5-a5ea-dd237863c810-kube-api-access-b7cdg\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.279840 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.280431 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.281435 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.281468 4749 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.281785 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.283674 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.284907 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.285391 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.287284 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.288522 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.291120 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.291838 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.292742 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.293034 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.293571 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.294612 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.295806 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.296485 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.297389 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.297969 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.298953 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.299700 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.300951 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.301470 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.301924 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.302810 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.303410 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.304491 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.308653 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.323453 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.331953 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.350852 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.368179 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.374180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aee52150-78bb-49e5-a5ea-dd237863c810-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.374241 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-system-cni-dir\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.374272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7cdg\" (UniqueName: \"kubernetes.io/projected/aee52150-78bb-49e5-a5ea-dd237863c810-kube-api-access-b7cdg\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.374321 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-cnibin\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.374353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-os-release\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.374383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aee52150-78bb-49e5-a5ea-dd237863c810-cni-binary-copy\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.374422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.375056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-cnibin\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.375426 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-system-cni-dir\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.375473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-os-release\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.375915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aee52150-78bb-49e5-a5ea-dd237863c810-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.376044 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aee52150-78bb-49e5-a5ea-dd237863c810-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.376296 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aee52150-78bb-49e5-a5ea-dd237863c810-cni-binary-copy\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.381510 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:05:47Z\\\",\\\"message\\\":\\\"W1001 13:05:46.849340 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:05:46.849656 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759323946 cert, and key in /tmp/serving-cert-604554110/serving-signer.crt, /tmp/serving-cert-604554110/serving-signer.key\\\\nI1001 13:05:47.223077 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:05:47.226382 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:05:47.226685 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:05:47.227789 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-604554110/tls.crt::/tmp/serving-cert-604554110/tls.key\\\\\\\"\\\\nF1001 13:05:47.558307 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 
13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.415867 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7cdg\" (UniqueName: \"kubernetes.io/projected/aee52150-78bb-49e5-a5ea-dd237863c810-kube-api-access-b7cdg\") pod \"multus-additional-cni-plugins-8sqjb\" (UID: \"aee52150-78bb-49e5-a5ea-dd237863c810\") " pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.435557 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.447743 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.447806 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"49f0190aa4eebaef15365a367b2e3af652ce48482d8f4f5357dd928c64a63dae"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.448993 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7fbbc3f35120309d3d734b1c4bda1c2eea48e1264c45eb92fac918adb565586b"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.450043 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6hrtf" event={"ID":"94d6a9bc-756a-41ca-9ce3-44b6fc834a78","Type":"ContainerStarted","Data":"4a8825f3c42291b55dda09d81f3ddbd5f0865f26038f97b71d0119072149686f"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.455089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.455134 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.455148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ef9229bd90e736c1f3d5839e944c2a6a04c29ed254de6fffb6279b68b04d7476"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.459047 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.459109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"823ad7bab3e025b46b4b17a03e9f68ac064b0badcc5eaf8b32b3c6a1bb323b3f"} Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.463559 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.468837 4749 scope.go:117] "RemoveContainer" containerID="b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98" Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.469000 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.472277 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.488615 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.501559 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.515935 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.531409 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.551817 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.562718 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.573384 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.582946 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.591603 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-nrgp7"] Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.592659 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.594402 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.594696 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fgjjp"] Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.595647 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.596674 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.596948 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.599500 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.599937 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.600413 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.600432 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.600911 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.602093 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.602158 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.624484 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://82197fb0b337dd1947b9c6679cffe3e4529098e2e164cb72d9df51f69f0b9b32\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:05:47Z\\\",\\\"message\\\":\\\"W1001 13:05:46.849340 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 13:05:46.849656 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759323946 cert, and key in /tmp/serving-cert-604554110/serving-signer.crt, /tmp/serving-cert-604554110/serving-signer.key\\\\nI1001 13:05:47.223077 1 observer_polling.go:159] Starting file observer\\\\nW1001 13:05:47.226382 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 13:05:47.226685 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:05:47.227789 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-604554110/tls.crt::/tmp/serving-cert-604554110/tls.key\\\\\\\"\\\\nF1001 13:05:47.558307 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 
13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: W1001 13:06:05.635450 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee52150_78bb_49e5_a5ea_dd237863c810.slice/crio-d7e04a89c3ebeb9d0c25a058c6d6c8672fac5159a97121bbd3b7cbf533ccaf68 WatchSource:0}: Error finding container d7e04a89c3ebeb9d0c25a058c6d6c8672fac5159a97121bbd3b7cbf533ccaf68: Status 404 returned error can't find the container with id d7e04a89c3ebeb9d0c25a058c6d6c8672fac5159a97121bbd3b7cbf533ccaf68 Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.641879 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.669983 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9
cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvrw\" (UniqueName: \"kubernetes.io/projected/33919a9e-1f0d-4127-915d-17d77d78853e-kube-api-access-sdvrw\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676761 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-cni-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-multus-certs\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-hostroot\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676842 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-kubelet\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676883 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-system-cni-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676902 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-netns\") pod \"multus-nrgp7\" (UID: 
\"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-conf-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-cni-multus\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-env-overrides\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.676995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-script-lib\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-kubelet\") pod \"multus-nrgp7\" (UID: 
\"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677043 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-ovn\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-socket-dir-parent\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-netns\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33919a9e-1f0d-4127-915d-17d77d78853e-multus-daemon-config\") pod \"multus-nrgp7\" (UID: 
\"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-etc-kubernetes\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33919a9e-1f0d-4127-915d-17d77d78853e-cni-binary-copy\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677167 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-cni-bin\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-slash\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-var-lib-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677241 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-cnibin\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677262 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-os-release\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677278 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-k8s-cni-cncf-io\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677296 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-node-log\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677315 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-netd\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 
13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-config\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovn-node-metrics-cert\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-systemd-units\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677383 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-systemd\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-etc-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677427 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpcbh\" (UniqueName: \"kubernetes.io/projected/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-kube-api-access-lpcbh\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-log-socket\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677459 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-ovn-kubernetes\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.677474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-bin\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.710785 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.751139 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.768176 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778095 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-multus-certs\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778232 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-hostroot\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " 
pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778251 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-kubelet\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778271 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778289 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-system-cni-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-netns\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778326 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-conf-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778346 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778365 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-cni-multus\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-env-overrides\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-script-lib\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-kubelet\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778435 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-ovn\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-netns\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-socket-dir-parent\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778534 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33919a9e-1f0d-4127-915d-17d77d78853e-multus-daemon-config\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-etc-kubernetes\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778572 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33919a9e-1f0d-4127-915d-17d77d78853e-cni-binary-copy\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-cni-bin\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-slash\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-var-lib-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-cnibin\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778661 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-os-release\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-k8s-cni-cncf-io\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778714 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-node-log\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778732 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-netd\") pod \"ovnkube-node-fgjjp\" (UID: 
\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-config\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovn-node-metrics-cert\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778794 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-systemd-units\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-systemd\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-etc-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpcbh\" (UniqueName: \"kubernetes.io/projected/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-kube-api-access-lpcbh\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778878 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-log-socket\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778898 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-ovn-kubernetes\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778915 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-bin\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvrw\" (UniqueName: \"kubernetes.io/projected/33919a9e-1f0d-4127-915d-17d77d78853e-kube-api-access-sdvrw\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 
crc kubenswrapper[4749]: I1001 13:06:05.778957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-cni-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778975 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.778993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779132 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779149 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779160 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779204 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:07.779189109 +0000 UTC m=+27.833174008 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779491 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:06:07.779482498 +0000 UTC m=+27.833467397 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.779521 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-multus-certs\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.779550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-hostroot\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.779572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-kubelet\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779609 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779633 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:07.779626622 +0000 UTC m=+27.833611521 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.779735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-system-cni-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.779761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-netns\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.779784 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-conf-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779837 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.779859 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:07.779853288 +0000 UTC m=+27.833838197 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.779880 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-cni-multus\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.780344 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-env-overrides\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.780763 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-script-lib\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.780796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-kubelet\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.780820 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-ovn\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.780869 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.780879 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.780886 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:05 crc kubenswrapper[4749]: E1001 13:06:05.780909 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:07.780901749 +0000 UTC m=+27.834886648 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.780930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.780953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-netns\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.781055 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-socket-dir-parent\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.781514 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33919a9e-1f0d-4127-915d-17d77d78853e-multus-daemon-config\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.781548 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-etc-kubernetes\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.781902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33919a9e-1f0d-4127-915d-17d77d78853e-cni-binary-copy\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.781933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-var-lib-cni-bin\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.781956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-slash\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.781979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-var-lib-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.782009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-cnibin\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.782049 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-os-release\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.782071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-host-run-k8s-cni-cncf-io\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.782093 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-node-log\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.782115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-netd\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.782265 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-log-socket\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.782951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-config\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.782990 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-ovn-kubernetes\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.783016 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-bin\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.783185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33919a9e-1f0d-4127-915d-17d77d78853e-multus-cni-dir\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.783210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc 
kubenswrapper[4749]: I1001 13:06:05.784873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.784929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-systemd\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.784956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-systemd-units\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.784980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-etc-openvswitch\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.785655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovn-node-metrics-cert\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.803450 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.810980 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.829934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvrw\" (UniqueName: \"kubernetes.io/projected/33919a9e-1f0d-4127-915d-17d77d78853e-kube-api-access-sdvrw\") pod \"multus-nrgp7\" (UID: \"33919a9e-1f0d-4127-915d-17d77d78853e\") " pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.861929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpcbh\" (UniqueName: \"kubernetes.io/projected/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-kube-api-access-lpcbh\") pod \"ovnkube-node-fgjjp\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.887975 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.932918 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.947603 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.955407 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nrgp7" Oct 01 13:06:05 crc kubenswrapper[4749]: W1001 13:06:05.970438 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33919a9e_1f0d_4127_915d_17d77d78853e.slice/crio-aaf55bdb8ad3888c56d8fb1eef5f544094a11fd5965fcac5dfda52df3725defb WatchSource:0}: Error finding container aaf55bdb8ad3888c56d8fb1eef5f544094a11fd5965fcac5dfda52df3725defb: Status 404 returned error can't find the container with id aaf55bdb8ad3888c56d8fb1eef5f544094a11fd5965fcac5dfda52df3725defb Oct 01 13:06:05 crc kubenswrapper[4749]: I1001 13:06:05.984975 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:05Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.024451 4749 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.069790 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni
/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.095379 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.127161 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.165698 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.211370 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.229239 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.229334 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:06 crc kubenswrapper[4749]: E1001 13:06:06.229708 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.229366 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:06 crc kubenswrapper[4749]: E1001 13:06:06.229880 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:06 crc kubenswrapper[4749]: E1001 13:06:06.229973 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.250239 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.296387 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.337133 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.367404 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.409704 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.448348 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.473092 4749 generic.go:334] "Generic (PLEG): container finished" podID="aee52150-78bb-49e5-a5ea-dd237863c810" containerID="26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767" exitCode=0 Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.473249 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-8sqjb" event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerDied","Data":"26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767"} Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.473328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerStarted","Data":"d7e04a89c3ebeb9d0c25a058c6d6c8672fac5159a97121bbd3b7cbf533ccaf68"} Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.474740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6"} Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.476858 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrgp7" event={"ID":"33919a9e-1f0d-4127-915d-17d77d78853e","Type":"ContainerStarted","Data":"d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233"} Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.476897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrgp7" event={"ID":"33919a9e-1f0d-4127-915d-17d77d78853e","Type":"ContainerStarted","Data":"aaf55bdb8ad3888c56d8fb1eef5f544094a11fd5965fcac5dfda52df3725defb"} Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.478563 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127" exitCode=0 Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.478605 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" 
event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.478677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"5439fc0322b5913fb92e4f8e8001dd36a895646ea758ee48478f6442c95c800c"} Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.480258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6hrtf" event={"ID":"94d6a9bc-756a-41ca-9ce3-44b6fc834a78","Type":"ContainerStarted","Data":"6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54"} Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.491386 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: E1001 13:06:06.524076 4749 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.573173 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.603037 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.625885 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.673579 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.709645 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.749343 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.791844 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.835246 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.878514 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.912515 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.953811 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:06 crc kubenswrapper[4749]: I1001 13:06:06.999752 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:06Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.027922 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.067211 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.106989 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.148084 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.187464 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.207389 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-xcxs2"] Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.207969 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.239691 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.239686 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.256809 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.279752 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.297003 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.300023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/027ade0f-680f-4066-8e28-d362fd24c84a-serviceca\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " 
pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.300102 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/027ade0f-680f-4066-8e28-d362fd24c84a-host\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.300129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jghf7\" (UniqueName: \"kubernetes.io/projected/027ade0f-680f-4066-8e28-d362fd24c84a-kube-api-access-jghf7\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.359697 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.386345 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.401235 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/027ade0f-680f-4066-8e28-d362fd24c84a-serviceca\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.401571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/027ade0f-680f-4066-8e28-d362fd24c84a-host\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.401642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/027ade0f-680f-4066-8e28-d362fd24c84a-host\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.401707 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jghf7\" (UniqueName: \"kubernetes.io/projected/027ade0f-680f-4066-8e28-d362fd24c84a-kube-api-access-jghf7\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.402198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/027ade0f-680f-4066-8e28-d362fd24c84a-serviceca\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 
13:06:07.432158 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.467773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jghf7\" (UniqueName: \"kubernetes.io/projected/027ade0f-680f-4066-8e28-d362fd24c84a-kube-api-access-jghf7\") pod \"node-ca-xcxs2\" (UID: \"027ade0f-680f-4066-8e28-d362fd24c84a\") " pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.484869 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953"} Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.489505 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" 
event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.489571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.489594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.489610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.489626 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.491664 4749 generic.go:334] "Generic (PLEG): container finished" podID="aee52150-78bb-49e5-a5ea-dd237863c810" containerID="d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea" exitCode=0 Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.491755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" 
event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerDied","Data":"d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea"} Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.495287 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269
019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a56
46fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCod
e\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.520965 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xcxs2" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.527744 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.568381 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.609322 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.650905 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.684113 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.724350 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.765798 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.803844 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.807445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.807683 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.807766 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:06:11.807728285 +0000 UTC m=+31.861713184 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.807818 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.807890 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.807916 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.807949 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.807996 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808017 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808079 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808099 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2025-10-01 13:06:11.808073105 +0000 UTC m=+31.862058014 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808128 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:11.808118487 +0000 UTC m=+31.862103396 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808176 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808191 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808189 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not 
registered Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808203 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808249 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:11.80823976 +0000 UTC m=+31.862224669 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:07 crc kubenswrapper[4749]: E1001 13:06:07.808268 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:11.808260271 +0000 UTC m=+31.862245180 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.842942 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.894691 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.933819 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:07 crc kubenswrapper[4749]: I1001 13:06:07.967852 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:07Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.010108 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.054415 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.087110 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.128867 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.168185 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.221288 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.229315 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.229317 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.229475 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:08 crc kubenswrapper[4749]: E1001 13:06:08.229761 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:08 crc kubenswrapper[4749]: E1001 13:06:08.229910 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:08 crc kubenswrapper[4749]: E1001 13:06:08.230214 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.250792 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.250862 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.251739 4749 scope.go:117] "RemoveContainer" containerID="b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98" Oct 01 13:06:08 crc kubenswrapper[4749]: E1001 13:06:08.251949 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.303093 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.332190 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.380293 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.408547 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.454579 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.486554 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.498732 4749 generic.go:334] "Generic (PLEG): container finished" podID="aee52150-78bb-49e5-a5ea-dd237863c810" containerID="08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97" exitCode=0 Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.498804 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" 
event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerDied","Data":"08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97"} Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.500396 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xcxs2" event={"ID":"027ade0f-680f-4066-8e28-d362fd24c84a","Type":"ContainerStarted","Data":"df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce"} Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.500431 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xcxs2" event={"ID":"027ade0f-680f-4066-8e28-d362fd24c84a","Type":"ContainerStarted","Data":"e147c1e2ab2bfe3e4609b6c61bd339f0ab507ec7654ef04e05a3750165af4c7e"} Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.506189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.529365 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.568325 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.606059 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.647259 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.685917 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.730263 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.767776 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.807545 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.845400 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.889689 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 
13:06:08.927426 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:08 crc kubenswrapper[4749]: I1001 13:06:08.969418 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:08Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.003931 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.049472 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.091931 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.128676 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.168922 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.211781 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.259700 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.515743 4749 generic.go:334] "Generic (PLEG): container finished" podID="aee52150-78bb-49e5-a5ea-dd237863c810" containerID="3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327" exitCode=0 Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.515830 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerDied","Data":"3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327"} Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.549958 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.571028 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.590436 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.618515 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.636158 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.654971 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.670337 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.686401 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.711562 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.726979 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.744532 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e615
64ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.765761 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.779512 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.811318 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:09 crc kubenswrapper[4749]: I1001 13:06:09.845975 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:09Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.229604 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:10 crc kubenswrapper[4749]: E1001 13:06:10.229770 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.229824 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.229904 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:10 crc kubenswrapper[4749]: E1001 13:06:10.230007 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:10 crc kubenswrapper[4749]: E1001 13:06:10.230083 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.525011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.528741 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerStarted","Data":"a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e"} Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.540779 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.556899 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.567543 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.581499 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.598719 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7c
dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.627764 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.653715 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.671327 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.694014 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.718119 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.735579 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.757882 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.774386 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.791943 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:10 crc kubenswrapper[4749]: I1001 13:06:10.805989 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:10Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.049974 4749 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.052200 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.052280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.052293 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.052461 4749 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.061167 4749 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.061532 4749 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.063320 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.063365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.063381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.063408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.063426 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.078462 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.082649 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.082693 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.082705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.082724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.082737 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.098196 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.102597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.102666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.102689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.102720 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.102741 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.117289 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to the previous attempt; elided ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.120992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.121034 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.121044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.121060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.121092 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.136616 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.140912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.140976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.140987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.141005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.141016 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.162138 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.162357 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.164507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.164550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.164566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.164589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.164604 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.254039 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7c
dg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.267509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.267581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.267601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.267628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.267647 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.274617 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.295000 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.313591 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.336799 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.370598 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.370909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.371187 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.371381 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.371500 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.377166 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.398313 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.418807 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.439067 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.470504 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.474487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.474543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.474557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.474577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.474590 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.497203 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.516903 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.536565 4749 generic.go:334] "Generic 
(PLEG): container finished" podID="aee52150-78bb-49e5-a5ea-dd237863c810" containerID="a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e" exitCode=0 Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.536583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerDied","Data":"a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.538413 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.561920 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.577913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.578010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.578033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.578070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.578096 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.586377 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.610370 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc 
kubenswrapper[4749]: I1001 13:06:11.632286 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.646209 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.669915 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.681821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.681882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.681904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.681936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.681956 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.685444 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.697799 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.710937 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.731788 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.745128 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.757342 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.773212 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.784731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.784779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.784791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.784813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.784827 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.797305 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.812296 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.826184 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[
{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.846492 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\
"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l
pcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswi
tch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.854400 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.854704 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:06:19.854657464 +0000 UTC m=+39.908642393 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.854840 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.854930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.854980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.855034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855123 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855170 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855193 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855263 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855325 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:19.855290672 +0000 UTC m=+39.909275611 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855358 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855395 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:19.855357644 +0000 UTC m=+39.909342683 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855401 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855446 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855511 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:19.855492998 +0000 UTC m=+39.909477937 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855585 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:11 crc kubenswrapper[4749]: E1001 13:06:11.855690 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:19.855676353 +0000 UTC m=+39.909661252 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.887615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.887799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.887926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.888019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.888104 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.992263 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.992612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.992812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.993026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:11 crc kubenswrapper[4749]: I1001 13:06:11.993261 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:11Z","lastTransitionTime":"2025-10-01T13:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.096862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.096922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.096935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.096960 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.096973 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.201109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.201176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.201196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.201268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.201294 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.229704 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.229739 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:12 crc kubenswrapper[4749]: E1001 13:06:12.229884 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.229991 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:12 crc kubenswrapper[4749]: E1001 13:06:12.230313 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:12 crc kubenswrapper[4749]: E1001 13:06:12.230614 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.304176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.304276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.304295 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.304320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.304340 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.408660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.408739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.408764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.408799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.408819 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.511046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.511084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.511095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.511109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.511119 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.543814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.544049 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.555315 4749 generic.go:334] "Generic (PLEG): container finished" podID="aee52150-78bb-49e5-a5ea-dd237863c810" containerID="50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332" exitCode=0 Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.555440 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerDied","Data":"50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.563449 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.582128 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.604209 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.613100 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.613294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.613435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.613561 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.613644 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.622409 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z 
is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.630808 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.640190 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.661458 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:
04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.679580 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.692142 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.704858 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.716445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.716513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.716531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.716562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.716582 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.719292 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.731932 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.747811 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.764153 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.777948 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.793169 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.811251 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.819558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc 
kubenswrapper[4749]: I1001 13:06:12.819610 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.819625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.819652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.819669 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.834618 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.849861 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe33
6f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.868155 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.887017 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.905841 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.922575 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.925211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.925274 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.925290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.925317 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.925336 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:12Z","lastTransitionTime":"2025-10-01T13:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.943514 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.962035 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.976156 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:12 crc kubenswrapper[4749]: I1001 13:06:12.988677 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.005137 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.027544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.027603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.027622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.027652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.027672 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.032231 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.062692 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.083309 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.133354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.133414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.133426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.133445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.133457 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.237145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.237206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.237239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.237276 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.237292 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.341263 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.341348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.341371 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.341404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.341428 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.445075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.445147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.445166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.445194 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.445210 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.548528 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.548584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.548601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.548630 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.548650 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.567242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" event={"ID":"aee52150-78bb-49e5-a5ea-dd237863c810","Type":"ContainerStarted","Data":"954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.567352 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.568111 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.588503 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.604697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.609352 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.625477 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.644068 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.651154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.651258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.651287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.651319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.651341 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.668691 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.702313 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.724141 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.744445 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.753865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.753911 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.753924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.753942 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.753953 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.764975 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z 
is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.797825 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.817760 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.838213 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.856044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.856096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.856105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.856127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.856139 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.858454 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.885559 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.902719 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.935347 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.954573 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.958646 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.958698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.958712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.958733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.958747 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:13Z","lastTransitionTime":"2025-10-01T13:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.969962 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:13 crc kubenswrapper[4749]: I1001 13:06:13.983532 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13
:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.003860 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.024113 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.043042 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.062283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.062346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.062365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.062386 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.062400 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.066908 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.080251 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.090410 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.103184 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.116370 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.126138 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.143420 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.160579 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:14Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.165189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.165270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.165282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.165300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.165310 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.229031 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.229110 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:14 crc kubenswrapper[4749]: E1001 13:06:14.229165 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.229156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:14 crc kubenswrapper[4749]: E1001 13:06:14.229301 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:14 crc kubenswrapper[4749]: E1001 13:06:14.229569 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.268602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.268657 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.268669 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.268691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.268703 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.372935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.373021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.373041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.373071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.373090 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.476647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.476732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.476768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.476800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.476823 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.570823 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.580135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.580192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.580211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.580262 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.580282 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.683401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.683476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.683490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.683537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.683550 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.787424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.787483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.787505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.787532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.787551 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.890945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.891012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.891030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.891056 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.891078 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.993143 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.993190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.993204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.993239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:14 crc kubenswrapper[4749]: I1001 13:06:14.993254 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:14Z","lastTransitionTime":"2025-10-01T13:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.096562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.096647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.096670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.096701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.096721 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.199500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.199565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.199577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.199600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.199615 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.301822 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.301876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.301887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.301905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.301916 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.405256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.405304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.405323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.405349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.405366 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.508841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.508902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.508934 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.508964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.508983 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.574518 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.611915 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.611983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.612002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.612029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.612046 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.717001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.717079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.717097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.717124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.717142 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.820057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.820109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.820121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.820142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.820156 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.923646 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.923725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.923746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.923778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:15 crc kubenswrapper[4749]: I1001 13:06:15.923797 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:15Z","lastTransitionTime":"2025-10-01T13:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.026270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.026316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.026328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.026353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.026365 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.129754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.129810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.129862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.129890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.129910 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.229725 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.229805 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:16 crc kubenswrapper[4749]: E1001 13:06:16.229946 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.229991 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:16 crc kubenswrapper[4749]: E1001 13:06:16.230120 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:16 crc kubenswrapper[4749]: E1001 13:06:16.230276 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.232325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.232364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.232375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.232390 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.232403 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.335426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.335485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.335498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.335520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.335535 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.438628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.438680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.438691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.438714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.438726 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.542077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.542141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.542164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.542191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.542208 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.580940 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/0.log" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.585197 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e" exitCode=1 Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.585265 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.586442 4749 scope.go:117] "RemoveContainer" containerID="d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.607944 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.630275 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.645207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.645322 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.645349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.645384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.645427 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.648559 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.667327 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e615
64ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.688829 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f124
5e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.726461 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554
ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.748688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.748907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.749008 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.749093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.749168 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.756742 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.778642 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.805068 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.839927 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:15Z\\\",\\\"message\\\":\\\"ork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:06:15.755156 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:15.755179 6031 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI1001 13:06:15.755303 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:15.755333 6031 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:15.755348 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:15.755357 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:06:15.755367 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:06:15.755401 6031 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:06:15.755412 6031 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:15.755422 6031 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:06:15.755445 6031 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:15.755471 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:06:15.755480 6031 factory.go:656] Stopping watch factory\\\\nI1001 13:06:15.755484 6031 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:15.755502 6031 ovnkube.go:599] Stopped ovnkube\\\\nI1001 
13:06:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba3
5127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.853111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.853168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.853181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.853206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.853238 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.857758 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.875757 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.898268 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.918072 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.935086 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:16Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.957150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.957211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.957265 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.957294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:16 crc kubenswrapper[4749]: I1001 13:06:16.957313 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:16Z","lastTransitionTime":"2025-10-01T13:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.061777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.061866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.061888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.061993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.062015 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.165707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.165819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.165860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.165895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.165916 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.269705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.269766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.269785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.269811 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.269829 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.372092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.372169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.372190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.372249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.372266 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.476052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.476106 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.476116 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.476135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.476147 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.578518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.578597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.578618 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.578646 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.578668 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.590688 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/0.log" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.607676 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.607858 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.625818 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.647882 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:15Z\\\",\\\"message\\\":\\\"ork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:06:15.755156 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:15.755179 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:15.755303 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 
13:06:15.755333 6031 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:15.755348 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:15.755357 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:06:15.755367 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:06:15.755401 6031 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:06:15.755412 6031 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:15.755422 6031 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:06:15.755445 6031 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:15.755471 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:06:15.755480 6031 factory.go:656] Stopping watch factory\\\\nI1001 13:06:15.755484 6031 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:15.755502 6031 ovnkube.go:599] Stopped ovnkube\\\\nI1001 
13:06:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.668974 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.681906 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.682172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.682294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.682400 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.682480 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.687917 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.706794 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.718700 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.733459 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.752737 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.772319 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.781626 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.785114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.785169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.785183 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.785207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.785239 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.795958 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.813411 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f124
5e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.842304 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554
ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.857128 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9
cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.872110 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:17Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.890689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.890750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.890761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.890782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.890793 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.994672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.994746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.994765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.994797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:17 crc kubenswrapper[4749]: I1001 13:06:17.994821 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:17Z","lastTransitionTime":"2025-10-01T13:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.098663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.098749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.098772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.098803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.098824 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.203154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.203253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.203269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.203291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.203307 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.229252 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.229371 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:18 crc kubenswrapper[4749]: E1001 13:06:18.229445 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:18 crc kubenswrapper[4749]: E1001 13:06:18.229652 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.229761 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:18 crc kubenswrapper[4749]: E1001 13:06:18.230046 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.301528 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8"] Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.302350 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.306828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.306950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.306971 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.306999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.307020 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.308825 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.309505 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.328393 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.365124 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:15Z\\\",\\\"message\\\":\\\"ork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:06:15.755156 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:15.755179 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:15.755303 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 
13:06:15.755333 6031 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:15.755348 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:15.755357 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:06:15.755367 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:06:15.755401 6031 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:06:15.755412 6031 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:15.755422 6031 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:06:15.755445 6031 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:15.755471 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:06:15.755480 6031 factory.go:656] Stopping watch factory\\\\nI1001 13:06:15.755484 6031 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:15.755502 6031 ovnkube.go:599] Stopped ovnkube\\\\nI1001 
13:06:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.386844 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.408656 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.410838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.410886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.410901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.410927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.410941 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.427702 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.430618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.430676 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.430771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.430929 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z87wz\" (UniqueName: \"kubernetes.io/projected/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-kube-api-access-z87wz\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.443980 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.458043 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.471852 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.486428 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.498740 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.514334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.514389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.514405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.514428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.514442 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.517777 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.532517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.532619 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.532681 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.532786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z87wz\" (UniqueName: \"kubernetes.io/projected/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-kube-api-access-z87wz\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.533667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.533772 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.542655 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.544387 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a
63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.551117 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z87wz\" (UniqueName: \"kubernetes.io/projected/ad4c39e4-a4e8-48b2-9e95-94d5e106257e-kube-api-access-z87wz\") pod \"ovnkube-control-plane-749d76644c-4wct8\" (UID: \"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.563489 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.585478 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.605166 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.615254 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/1.log" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.615862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.615882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.615890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.615905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.615918 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.616253 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/0.log" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.619756 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634" exitCode=1 Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.619804 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.619901 4749 scope.go:117] "RemoveContainer" containerID="d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.620568 4749 scope.go:117] "RemoveContainer" containerID="9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634" Oct 01 13:06:18 crc kubenswrapper[4749]: E1001 13:06:18.620758 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" 
Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.622861 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.641813 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59
a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.674960 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:15Z\\\",\\\"message\\\":\\\"ork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:06:15.755156 
6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:15.755179 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:15.755303 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:15.755333 6031 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:15.755348 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:15.755357 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:06:15.755367 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:06:15.755401 6031 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:06:15.755412 6031 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:15.755422 6031 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:06:15.755445 6031 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:15.755471 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:06:15.755480 6031 factory.go:656] Stopping watch factory\\\\nI1001 13:06:15.755484 6031 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:15.755502 6031 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 
13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4ae
ba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.693245 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.712166 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\
\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.718910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.718973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.718992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.719020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.719038 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.731006 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.745996 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.762925 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.782850 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.804864 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.820922 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e615
64ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.823369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.823783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.823813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc 
kubenswrapper[4749]: I1001 13:06:18.823841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.823854 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.845376 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50
d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.869010 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.887406 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.906722 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.928266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.928323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.928342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.928368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.928385 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:18Z","lastTransitionTime":"2025-10-01T13:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.941574 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.969208 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:18 crc kubenswrapper[4749]: I1001 13:06:18.991153 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:18Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.032013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.032053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.032066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.032087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.032100 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.135980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.136060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.136082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.136115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.136140 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.238735 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.238800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.238819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.238851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.238871 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.343418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.343482 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.343501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.343527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.343548 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.446874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.447338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.447358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.447387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.447410 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.468735 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-mwlpq"] Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.469694 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.469826 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.506059 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:15Z\\\",\\\"message\\\":\\\"ork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:06:15.755156 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:15.755179 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:15.755303 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 
13:06:15.755333 6031 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:15.755348 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:15.755357 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:06:15.755367 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:06:15.755401 6031 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:06:15.755412 6031 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:15.755422 6031 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:06:15.755445 6031 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:15.755471 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:06:15.755480 6031 factory.go:656] Stopping watch factory\\\\nI1001 13:06:15.755484 6031 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:15.755502 6031 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\
\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-override
s\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.527633 
4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.550695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.550764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.550785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.550813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.550835 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.563259 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z 
is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.585102 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.601607 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.619294 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc 
kubenswrapper[4749]: I1001 13:06:19.628666 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/1.log" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.636625 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" event={"ID":"ad4c39e4-a4e8-48b2-9e95-94d5e106257e","Type":"ContainerStarted","Data":"c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.636699 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" event={"ID":"ad4c39e4-a4e8-48b2-9e95-94d5e106257e","Type":"ContainerStarted","Data":"56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.636722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" event={"ID":"ad4c39e4-a4e8-48b2-9e95-94d5e106257e","Type":"ContainerStarted","Data":"2def02aa61b14a7c087d0bc8036bb6c8c6f939b5ac2efe69e2b3a197be8faf02"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.646141 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.646388 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.646478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d57dh\" (UniqueName: \"kubernetes.io/projected/27497171-a8cc-4282-8ee6-2f68f768fc69-kube-api-access-d57dh\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.653948 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.653993 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.654012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.654035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.654054 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.669807 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.691018 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.711758 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e615
64ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.738718 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f124
5e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.747773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d57dh\" (UniqueName: \"kubernetes.io/projected/27497171-a8cc-4282-8ee6-2f68f768fc69-kube-api-access-d57dh\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.749049 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.749442 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.749678 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs podName:27497171-a8cc-4282-8ee6-2f68f768fc69 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:20.249592177 +0000 UTC m=+40.303577106 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs") pod "network-metrics-daemon-mwlpq" (UID: "27497171-a8cc-4282-8ee6-2f68f768fc69") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.757984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.758044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.758065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.758093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.758111 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.763936 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.781160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d57dh\" (UniqueName: \"kubernetes.io/projected/27497171-a8cc-4282-8ee6-2f68f768fc69-kube-api-access-d57dh\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.782867 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.801815 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.827148 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.844532 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.862343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.862396 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.862418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.862444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: 
I1001 13:06:19.862464 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.870359 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.888252 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.905725 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.922889 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.940648 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.952428 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.952600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.952704 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.952737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.952827 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:06:35.952763103 +0000 UTC m=+56.006748042 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.952909 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.952935 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.952924 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.952984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.953052 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:35.95301933 +0000 UTC m=+56.007004269 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.952951 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.953102 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.953320 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.953341 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:35.953280077 +0000 UTC m=+56.007265156 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.953393 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:35.95337294 +0000 UTC m=+56.007357849 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.953385 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.953445 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:19 crc kubenswrapper[4749]: E1001 13:06:19.953586 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" 
failed. No retries permitted until 2025-10-01 13:06:35.953547065 +0000 UTC m=+56.007532104 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.966176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.966282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.966307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.966345 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.966374 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:19Z","lastTransitionTime":"2025-10-01T13:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:19 crc kubenswrapper[4749]: I1001 13:06:19.970783 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:19Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.007362 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.033622 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.057759 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.070181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.070298 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.070319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.070350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.070368 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.083633 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z 
is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.117762 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:15Z\\\",\\\"message\\\":\\\"ork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:06:15.755156 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:15.755179 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:15.755303 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 
13:06:15.755333 6031 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:15.755348 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:15.755357 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:06:15.755367 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:06:15.755401 6031 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:06:15.755412 6031 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:15.755422 6031 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:06:15.755445 6031 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:15.755471 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:06:15.755480 6031 factory.go:656] Stopping watch factory\\\\nI1001 13:06:15.755484 6031 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:15.755502 6031 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\
\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-override
s\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.137107 
4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.155740 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc 
kubenswrapper[4749]: I1001 13:06:20.174049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.174103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.174121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.174148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.174168 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.177657 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04
cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.200550 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.228841 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.228975 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.229032 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.229073 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: E1001 13:06:20.229166 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:20 crc kubenswrapper[4749]: E1001 13:06:20.229346 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:20 crc kubenswrapper[4749]: E1001 13:06:20.229492 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.251009 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.256089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:20 crc kubenswrapper[4749]: E1001 13:06:20.256353 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:20 crc kubenswrapper[4749]: E1001 13:06:20.256465 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs podName:27497171-a8cc-4282-8ee6-2f68f768fc69 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:21.25643295 +0000 UTC m=+41.310417879 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs") pod "network-metrics-daemon-mwlpq" (UID: "27497171-a8cc-4282-8ee6-2f68f768fc69") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.268352 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:20Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.278532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.278576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.278594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.278621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.278639 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.382014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.382652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.382829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.382994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.383160 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.487163 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.487262 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.487282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.487308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.487328 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.591338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.591437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.591480 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.591506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.591524 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.695565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.695643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.695664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.695724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.695746 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.798914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.798985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.799000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.799021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.799035 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.902139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.902273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.902297 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.902327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:20 crc kubenswrapper[4749]: I1001 13:06:20.902349 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:20Z","lastTransitionTime":"2025-10-01T13:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.006159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.006442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.006557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.006655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.006726 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.109429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.109477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.109488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.109505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.109517 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.212785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.212841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.212851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.212871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.212883 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.229300 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.229475 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.256679 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.269355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.269658 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.269799 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs podName:27497171-a8cc-4282-8ee6-2f68f768fc69 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:23.269764816 +0000 UTC m=+43.323749755 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs") pod "network-metrics-daemon-mwlpq" (UID: "27497171-a8cc-4282-8ee6-2f68f768fc69") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.275776 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.295994 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.316787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.316863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.316883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.316917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.316936 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.321347 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04
cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.346123 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.368895 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.393296 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e615
64ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.415253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.415382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.415406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc 
kubenswrapper[4749]: I1001 13:06:21.415437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.415460 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.423814 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50
d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.443805 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.445463 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.449829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.449991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.450081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.450280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.450379 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.468415 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.473778 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.478976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.479102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.479123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.479155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.479175 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.487362 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.500807 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.506769 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.506846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.506866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.506897 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.506920 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.524200 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.530126 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.536371 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.536443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.536467 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.536499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.536519 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.553833 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.561807 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: E1001 13:06:21.562090 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.564819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.564883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.564902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.564932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.564953 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.577106 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.612612 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:15Z\\\",\\\"message\\\":\\\"ork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:06:15.755156 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:15.755179 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:15.755303 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 
13:06:15.755333 6031 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:15.755348 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:15.755357 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:06:15.755367 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:06:15.755401 6031 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:06:15.755412 6031 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:15.755422 6031 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:06:15.755445 6031 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:15.755471 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:06:15.755480 6031 factory.go:656] Stopping watch factory\\\\nI1001 13:06:15.755484 6031 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:15.755502 6031 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\
\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-override
s\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.632173 
4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.656371 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:21Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.667796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc 
kubenswrapper[4749]: I1001 13:06:21.667854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.667875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.667904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.667926 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.770915 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.770983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.771001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.771031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.771053 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.875526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.875577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.875596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.875621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.875639 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.979204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.979322 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.979341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.979368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:21 crc kubenswrapper[4749]: I1001 13:06:21.979387 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:21Z","lastTransitionTime":"2025-10-01T13:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.082430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.082513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.082534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.082566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.082585 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.189035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.189109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.189128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.189157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.189180 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.229090 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.229124 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:22 crc kubenswrapper[4749]: E1001 13:06:22.229316 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.229346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:22 crc kubenswrapper[4749]: E1001 13:06:22.229597 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:22 crc kubenswrapper[4749]: E1001 13:06:22.230050 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.230605 4749 scope.go:117] "RemoveContainer" containerID="b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.292744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.292807 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.292819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.292839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.292871 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.396184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.396266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.396281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.396303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.396320 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.499644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.499701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.499714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.499734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.499752 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.603725 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.603796 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.603815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.603841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.603860 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.651257 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.653669 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.654153 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.681184 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.703431 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.706210 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.706288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.706308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.706338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.706359 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.724212 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.742157 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.761442 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc 
kubenswrapper[4749]: I1001 13:06:22.783476 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.804937 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.809827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 
13:06:22.809882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.809903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.809931 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.809953 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.827384 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.851375 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.878967 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.902128 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.913579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.913642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.913661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:22 crc 
kubenswrapper[4749]: I1001 13:06:22.913691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.913714 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:22Z","lastTransitionTime":"2025-10-01T13:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.927686 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:22 crc kubenswrapper[4749]: I1001 13:06:22.951580 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.000572 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:22Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.017094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.017152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.017170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.017204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.017249 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.025676 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:23Z 
is after 2025-08-24T17:21:41Z" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.061000 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d57bc97bafc80ec253caab6df543e4caa2eba0b0b646a03173795c87c352a11e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:15Z\\\",\\\"message\\\":\\\"ork/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:06:15.755156 6031 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:15.755179 6031 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:15.755303 6031 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 
13:06:15.755333 6031 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:15.755348 6031 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:15.755357 6031 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 13:06:15.755367 6031 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 13:06:15.755401 6031 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 13:06:15.755412 6031 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:15.755422 6031 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1001 13:06:15.755445 6031 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:15.755471 6031 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 13:06:15.755480 6031 factory.go:656] Stopping watch factory\\\\nI1001 13:06:15.755484 6031 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:15.755502 6031 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where 
column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\
\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-override
s\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.080835 
4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:23Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.120988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.121067 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.121087 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.121123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.121143 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.224761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.224824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.224845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.224875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.224899 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.229549 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:23 crc kubenswrapper[4749]: E1001 13:06:23.229745 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.298694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:23 crc kubenswrapper[4749]: E1001 13:06:23.298970 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:23 crc kubenswrapper[4749]: E1001 13:06:23.299123 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs podName:27497171-a8cc-4282-8ee6-2f68f768fc69 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:27.299086214 +0000 UTC m=+47.353071183 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs") pod "network-metrics-daemon-mwlpq" (UID: "27497171-a8cc-4282-8ee6-2f68f768fc69") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.328533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.328595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.328617 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.328643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.328666 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.432405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.432483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.432513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.432543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.432565 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.537431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.537531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.537586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.537615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.537633 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.640981 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.641063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.641084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.641109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.641129 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.744935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.745023 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.745042 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.745070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.745091 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.848519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.848595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.848614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.848644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.848667 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.952051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.952118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.952137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.952164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:23 crc kubenswrapper[4749]: I1001 13:06:23.952184 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:23Z","lastTransitionTime":"2025-10-01T13:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.055541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.055605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.055624 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.055651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.055671 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.158961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.159046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.159067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.159097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.159117 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.229045 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.229212 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:24 crc kubenswrapper[4749]: E1001 13:06:24.229288 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.229045 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:24 crc kubenswrapper[4749]: E1001 13:06:24.229470 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:24 crc kubenswrapper[4749]: E1001 13:06:24.229742 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.262924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.262978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.262996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.263021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.263040 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.366406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.366477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.366496 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.366526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.366546 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.470059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.470145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.470166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.470199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.470257 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.574875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.574949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.574969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.575003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.575021 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.678347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.678414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.678430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.678455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.678475 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.782241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.782319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.782340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.782366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.782385 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.885766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.885823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.885838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.885860 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.885878 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.989674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.989741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.989754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.989779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:24 crc kubenswrapper[4749]: I1001 13:06:24.989795 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:24Z","lastTransitionTime":"2025-10-01T13:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.092914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.092978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.092995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.093023 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.093041 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.196376 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.196460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.196480 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.196509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.196529 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.229478 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:25 crc kubenswrapper[4749]: E1001 13:06:25.229789 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.299070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.299119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.299130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.299147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.299162 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.402975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.403060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.403082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.403113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.403133 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.507310 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.507398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.507426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.507463 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.507491 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.610885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.610965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.610984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.611012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.611029 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.714428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.714522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.714541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.714571 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.714592 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.817057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.817113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.817123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.817146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.817158 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.919871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.920401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.920500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.920601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:25 crc kubenswrapper[4749]: I1001 13:06:25.920682 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:25Z","lastTransitionTime":"2025-10-01T13:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.024343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.024410 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.024429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.024461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.024481 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.128015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.128075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.128096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.128123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.128139 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.229958 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.230013 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.230028 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:26 crc kubenswrapper[4749]: E1001 13:06:26.230305 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:26 crc kubenswrapper[4749]: E1001 13:06:26.230546 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:26 crc kubenswrapper[4749]: E1001 13:06:26.230791 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.232731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.233357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.233636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.233767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.233857 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.339550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.339814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.339838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.339868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.339887 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.443819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.443872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.443888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.443912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.443930 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.547496 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.547554 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.547573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.547597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.547616 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.651180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.651275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.651296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.651321 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.651341 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.753737 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.753816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.753837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.753867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.753885 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.857156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.857298 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.857324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.857358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.857377 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.960307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.960369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.960387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.960416 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:26 crc kubenswrapper[4749]: I1001 13:06:26.960435 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:26Z","lastTransitionTime":"2025-10-01T13:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.063128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.063176 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.063188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.063206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.063235 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.166204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.166663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.166681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.166709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.166727 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.229055 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:27 crc kubenswrapper[4749]: E1001 13:06:27.229494 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.269987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.270067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.270089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.270120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.270142 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.351172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:27 crc kubenswrapper[4749]: E1001 13:06:27.351512 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:27 crc kubenswrapper[4749]: E1001 13:06:27.351676 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs podName:27497171-a8cc-4282-8ee6-2f68f768fc69 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:35.351631033 +0000 UTC m=+55.405616052 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs") pod "network-metrics-daemon-mwlpq" (UID: "27497171-a8cc-4282-8ee6-2f68f768fc69") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.374714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.374804 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.374828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.374862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.374884 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.477997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.478069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.478094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.478125 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.478148 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.581955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.582018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.582039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.582065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.582084 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.684948 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.685013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.685030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.685056 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.685074 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.788273 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.788524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.788566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.788602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.788624 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.892147 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.892247 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.892266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.892340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.892371 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.995105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.995174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.995192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.995245 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:27 crc kubenswrapper[4749]: I1001 13:06:27.995266 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:27Z","lastTransitionTime":"2025-10-01T13:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.098553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.098614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.098629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.098652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.098666 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.202717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.202798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.202821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.202882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.202909 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.229270 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.229291 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:28 crc kubenswrapper[4749]: E1001 13:06:28.229587 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:28 crc kubenswrapper[4749]: E1001 13:06:28.229744 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.229813 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:28 crc kubenswrapper[4749]: E1001 13:06:28.229945 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.306619 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.306685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.306702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.306728 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.306745 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.410728 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.410790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.410807 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.410835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.410852 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.514957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.515028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.515046 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.515076 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.515097 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.618512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.618577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.618593 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.618618 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.618635 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.721620 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.721672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.721689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.721717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.721734 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.824637 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.824721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.824746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.824778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.824797 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.928354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.928428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.928447 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.928475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:28 crc kubenswrapper[4749]: I1001 13:06:28.928492 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:28Z","lastTransitionTime":"2025-10-01T13:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.031914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.031991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.032006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.032025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.032041 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.135506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.135586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.135611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.135678 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.135737 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.229632 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:29 crc kubenswrapper[4749]: E1001 13:06:29.230109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.238651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.238710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.238734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.238767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.238790 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.342071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.342144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.342167 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.342194 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.342246 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.445404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.445473 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.445495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.445527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.445554 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.537281 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.538576 4749 scope.go:117] "RemoveContainer" containerID="9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.548579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.548955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.549114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.549323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.549497 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.572068 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.597265 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.620816 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.635712 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.652518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.652745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.652908 4749 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.653360 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.653525 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.655926 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.682496 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/1.log" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.688922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.689761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.691091 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.705859 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.728125 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc 
kubenswrapper[4749]: I1001 13:06:29.746606 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.758616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.758694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.758714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.758739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.758760 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.771795 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.790864 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.805645 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.825118 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.842157 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.862746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.862819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.862836 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.862859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.862872 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.863129 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.878027 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.896705 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.921537 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.944490 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.966512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.966575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.966588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.966609 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.966621 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:29Z","lastTransitionTime":"2025-10-01T13:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.969593 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:29 crc kubenswrapper[4749]: I1001 13:06:29.987085 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13
:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:29Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.013730 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 
13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\
"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.030422 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.044112 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.061438 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.069283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.069335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.069364 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 
13:06:30.069398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.069413 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.075484 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.087183 4749 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.099623 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06
:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.110640 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.124485 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.138232 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.151194 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.165378 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.172322 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.172388 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.172406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.172434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.172455 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.182969 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.229443 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.229610 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.229611 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:30 crc kubenswrapper[4749]: E1001 13:06:30.229795 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:30 crc kubenswrapper[4749]: E1001 13:06:30.229968 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:30 crc kubenswrapper[4749]: E1001 13:06:30.230152 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.275838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.275892 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.275905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.275925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.275939 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.378922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.378957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.378967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.378983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.378992 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.482828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.482884 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.482896 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.482919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.482932 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.585984 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.586060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.586078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.586110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.586128 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.689959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.690032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.690050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.690078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.690100 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.694915 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/2.log" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.695935 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/1.log" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.699789 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444" exitCode=1 Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.699847 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.699914 4749 scope.go:117] "RemoveContainer" containerID="9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.700990 4749 scope.go:117] "RemoveContainer" containerID="89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444" Oct 01 13:06:30 crc kubenswrapper[4749]: E1001 13:06:30.701309 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.729832 4749 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.753185 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.771257 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.790993 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.793656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.793713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.793731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.793758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.793777 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.828327 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.864919 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.886199 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.896828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.896911 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.896937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.896975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.897003 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:30Z","lastTransitionTime":"2025-10-01T13:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.910099 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.931635 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 13:06:30.962985 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed 
*v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name
\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-o
verrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:30 crc kubenswrapper[4749]: I1001 
13:06:30.982769 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:30Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.000436 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.000504 4749 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.000522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.000550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.000571 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.003852 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.026337 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.052396 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.076442 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.097767 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.102986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.103054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.103075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.103104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.103126 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.112617 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.205790 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.205840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.205851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.205867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.205892 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.229387 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:31 crc kubenswrapper[4749]: E1001 13:06:31.229605 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.252456 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 
crc kubenswrapper[4749]: I1001 13:06:31.272753 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.292668 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.322554 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.323129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.323175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.323190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.323234 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.323249 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.344588 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04
cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.369797 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.389054 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.402568 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.418858 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.426158 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.426200 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.426232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.426253 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.426267 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.435906 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.449169 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.468077 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.486500 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.507626 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.528760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.528820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.528839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.528873 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.528893 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.530653 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z 
is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.559379 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9378cf9fefbbc95ba00425df2d28df49e302ddca485a7d09770c15470fd75634\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"message\\\":\\\"ted as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.749106 6175 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-config-operator/machine-config-operator]} name:Service_openshift-machine-config-operator/machine-config-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.183:9001:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {5b85277d-d9b7-4a68-8e4e-2b80594d9347}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1001 13:06:17.7\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed 
*v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name
\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-o
verrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 
13:06:31.567887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.567929 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.567942 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.567966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.567981 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.578090 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: E1001 13:06:31.584438 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.587999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.588091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.588161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.588243 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.588337 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: E1001 13:06:31.601781 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.606935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.606995 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.607014 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.607041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.607059 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: E1001 13:06:31.628070 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.633714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.633782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.633802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.633829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.633849 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: E1001 13:06:31.680992 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.682976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.683055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.683077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.683105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.683128 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.706605 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/2.log" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.711579 4749 scope.go:117] "RemoveContainer" containerID="89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444" Oct 01 13:06:31 crc kubenswrapper[4749]: E1001 13:06:31.711826 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.733496 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.750177 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.770034 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.786779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.786845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.786861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.786887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.786905 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.795612 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.817632 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.840613 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.864152 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.891047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.891120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.891140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.891169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.891190 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.900680 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.922174 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.957431 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy 
event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.976409 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.994670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.994730 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.994750 4749 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.994780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.994801 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:31Z","lastTransitionTime":"2025-10-01T13:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:31 crc kubenswrapper[4749]: I1001 13:06:31.999761 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:31Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.020491 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.040018 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.057696 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.077626 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.098005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.098065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.098083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.098112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.098133 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.101375 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04
cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:32Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.201881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.202779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.202818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.202846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.202859 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.229645 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.229815 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:32 crc kubenswrapper[4749]: E1001 13:06:32.229974 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.230599 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:32 crc kubenswrapper[4749]: E1001 13:06:32.230714 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:32 crc kubenswrapper[4749]: E1001 13:06:32.230873 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.305670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.305731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.305748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.305774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.305796 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.409951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.410035 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.410055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.410086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.410108 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.513066 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.513420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.513619 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.513764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.513900 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.618462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.618546 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.618570 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.618603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.618627 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.721744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.721835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.721867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.721902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.721925 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.824760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.824820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.824840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.824869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.824890 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.928504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.928571 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.928588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.928615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:32 crc kubenswrapper[4749]: I1001 13:06:32.928635 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:32Z","lastTransitionTime":"2025-10-01T13:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.032210 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.032307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.032322 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.032350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.032368 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.136327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.136399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.136440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.136479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.136499 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.229946 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:33 crc kubenswrapper[4749]: E1001 13:06:33.230176 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.240047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.240096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.240114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.240140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.240158 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.344013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.344073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.344091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.344123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.344142 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.447301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.447368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.447397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.447431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.447452 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.550707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.550795 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.550820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.550852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.550874 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.654162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.654279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.654305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.654339 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.654356 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.757402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.757458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.757476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.757503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.757557 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.798822 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.811312 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.832414 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd
080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.855227 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.860151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.860179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.860189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.860205 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.860233 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.877685 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.896606 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.919408 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.951960 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy 
event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.963432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.963493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.963512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.963540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.963560 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:33Z","lastTransitionTime":"2025-10-01T13:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.980257 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:33 crc kubenswrapper[4749]: I1001 13:06:33.999820 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:33Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc 
kubenswrapper[4749]: I1001 13:06:34.024162 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.048562 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.068079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.068166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc 
kubenswrapper[4749]: I1001 13:06:34.068192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.068252 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.068275 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.074408 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.095565 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.152073 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.172168 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.172667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.172726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.172745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.172778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.172798 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.192645 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.208709 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.229383 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.229450 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.229383 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.229466 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e615
64ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:34Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:34 crc kubenswrapper[4749]: E1001 13:06:34.229599 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:34 crc kubenswrapper[4749]: E1001 13:06:34.229752 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:34 crc kubenswrapper[4749]: E1001 13:06:34.229897 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.276487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.276559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.276578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.276611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.276631 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.380347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.380433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.380458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.380493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.380524 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.484178 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.484324 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.484346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.484380 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.484400 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.588606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.588688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.588706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.588735 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.588754 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.692374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.692449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.692472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.692500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.692518 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.795829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.795884 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.795897 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.795920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.795933 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.899181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.899278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.899298 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.899326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:34 crc kubenswrapper[4749]: I1001 13:06:34.899345 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:34Z","lastTransitionTime":"2025-10-01T13:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.002654 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.002704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.002719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.002743 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.002756 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.106182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.106249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.106262 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.106280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.106291 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.209307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.209373 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.209391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.209418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.209437 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.229683 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:35 crc kubenswrapper[4749]: E1001 13:06:35.229816 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.312912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.313003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.313022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.313049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.313066 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.415538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.415594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.415611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.415634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.415655 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.445093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:35 crc kubenswrapper[4749]: E1001 13:06:35.445284 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:35 crc kubenswrapper[4749]: E1001 13:06:35.445383 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs podName:27497171-a8cc-4282-8ee6-2f68f768fc69 nodeName:}" failed. No retries permitted until 2025-10-01 13:06:51.445356919 +0000 UTC m=+71.499341848 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs") pod "network-metrics-daemon-mwlpq" (UID: "27497171-a8cc-4282-8ee6-2f68f768fc69") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.518629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.518702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.518721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.518752 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.518773 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.621535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.621592 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.621611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.621636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.621653 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.723838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.723913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.723932 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.723964 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.723990 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.826759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.826830 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.826847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.826875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.826907 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.930135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.930194 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.930206 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.930268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:35 crc kubenswrapper[4749]: I1001 13:06:35.930281 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:35Z","lastTransitionTime":"2025-10-01T13:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.033630 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.033701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.033719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.033746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.033764 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.052658 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.052819 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.052860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.052887 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.052919 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053292 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053300 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053324 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053365 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053387 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053421 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053459 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:07:08.053421064 +0000 UTC m=+88.107406003 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053320 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053490 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:07:08.053474786 +0000 UTC m=+88.107459725 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053497 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053523 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:07:08.053507257 +0000 UTC m=+88.107492196 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053548 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-01 13:07:08.053536728 +0000 UTC m=+88.107521657 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.053679 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:07:08.053645441 +0000 UTC m=+88.107630380 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.136479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.136576 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.136595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.136621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 
13:06:36.136638 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.229135 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.229181 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.229168 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.229380 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.229516 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:36 crc kubenswrapper[4749]: E1001 13:06:36.229611 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.239109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.239164 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.239182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.239209 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.239254 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.342514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.342572 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.342585 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.342605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.342619 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.445712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.445819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.445835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.445975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.445997 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.548552 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.548648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.548670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.548701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.548720 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.651989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.652074 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.652088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.652111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.652124 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.754832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.754912 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.754943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.755041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.755067 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.798142 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.815483 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.834003 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.855911 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.862320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.862383 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.862396 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.862421 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.862439 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.878407 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.896889 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.911431 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.933942 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.954016 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.965645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.965710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.965729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.965757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.965777 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:36Z","lastTransitionTime":"2025-10-01T13:06:36Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.976694 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:36 crc kubenswrapper[4749]: I1001 13:06:36.993885 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:36Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.009923 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.045964 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.068439 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a858697955
2556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.068823 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.068893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.068920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.068952 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.068977 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.089083 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.109286 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ku
be-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.126121 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.146152 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.172113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc 
kubenswrapper[4749]: I1001 13:06:37.172175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.172189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.172232 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.172249 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.178549 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:37Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.230130 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:37 crc kubenswrapper[4749]: E1001 13:06:37.230429 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.275589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.275664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.275683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.275712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.275731 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.378826 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.378887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.378902 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.378927 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.378941 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.482410 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.482482 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.482502 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.482530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.482549 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.592270 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.592317 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.592331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.592351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.592363 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.696254 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.696321 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.696338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.696365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.696384 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.799889 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.799966 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.799986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.800016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.800041 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.903957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.904034 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.904053 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.904082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:37 crc kubenswrapper[4749]: I1001 13:06:37.904102 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:37Z","lastTransitionTime":"2025-10-01T13:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.007989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.008067 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.008089 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.008118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.008136 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.111827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.111894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.111908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.111931 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.111946 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.215506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.215573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.215596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.215621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.215639 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.229889 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.229953 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.229979 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:38 crc kubenswrapper[4749]: E1001 13:06:38.230095 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:38 crc kubenswrapper[4749]: E1001 13:06:38.230276 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:38 crc kubenswrapper[4749]: E1001 13:06:38.230470 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.319679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.319775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.319793 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.319818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.319835 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.423421 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.423482 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.423500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.423527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.423548 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.526741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.526820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.526847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.526882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.526905 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.630658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.630707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.630719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.630741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.630753 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.734690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.734745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.734762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.734787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.734808 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.837433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.837509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.837527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.837552 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.837572 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.940399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.940475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.940499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.940534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:38 crc kubenswrapper[4749]: I1001 13:06:38.940556 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:38Z","lastTransitionTime":"2025-10-01T13:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.043420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.043584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.043612 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.043647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.043670 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.147461 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.147538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.147558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.147586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.147609 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.229899 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:39 crc kubenswrapper[4749]: E1001 13:06:39.230198 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.249522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.249590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.249601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.249617 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.249628 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.352330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.352378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.352389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.352407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.352418 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.455110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.455179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.455199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.455257 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.455283 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.558729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.558838 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.558865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.558898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.558922 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.662120 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.662189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.662208 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.662272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.662293 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.764434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.764485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.764507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.764529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.764545 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.866710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.866771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.866789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.866813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.866836 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.970258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.970319 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.970338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.970370 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:39 crc kubenswrapper[4749]: I1001 13:06:39.970391 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:39Z","lastTransitionTime":"2025-10-01T13:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.073426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.073505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.073528 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.073558 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.073577 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.177594 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.177659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.177676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.177701 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.177721 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.229173 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.229307 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.229173 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:40 crc kubenswrapper[4749]: E1001 13:06:40.229366 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:40 crc kubenswrapper[4749]: E1001 13:06:40.229513 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:40 crc kubenswrapper[4749]: E1001 13:06:40.229620 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.281099 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.281177 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.281201 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.281272 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.281296 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.384842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.384915 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.384940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.384970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.384990 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.488645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.488709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.488723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.488748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.488762 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.591968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.592028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.592039 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.592061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.592074 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.695015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.695106 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.695124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.695154 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.695174 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.798129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.798280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.798307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.798343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.798365 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.900834 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.900903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.900921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.900949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:40 crc kubenswrapper[4749]: I1001 13:06:40.900973 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:40Z","lastTransitionTime":"2025-10-01T13:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.003588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.003651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.003669 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.003698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.003716 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.107608 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.107664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.107683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.107707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.107727 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.210582 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.210623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.210636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.210654 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.210666 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.229719 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:41 crc kubenswrapper[4749]: E1001 13:06:41.231617 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.247538 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.265104 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363
810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\
"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.287124 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T
13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.305980 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.313670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.313753 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.313773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc 
kubenswrapper[4749]: I1001 13:06:41.313805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.313823 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.323678 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.343041 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.372699 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.390717 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a858697955
2556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.407017 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.416773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc 
kubenswrapper[4749]: I1001 13:06:41.416805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.416814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.416830 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.416839 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.435695 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.454829 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.470552 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.491029 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.508600 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.520819 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.520856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.520893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.520913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.520924 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.525289 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.542901 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.560759 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.575730 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.623694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.623738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.623748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.623767 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.623777 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.766281 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.766316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.766327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.766346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.766358 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.767514 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.767541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.767552 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.767565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.767576 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: E1001 13:06:41.786029 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.790075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.790110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.790122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.790141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.790152 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: E1001 13:06:41.803326 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.808199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.808303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.808327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.808357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.808382 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: E1001 13:06:41.826057 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.830992 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.831081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.831108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.831141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.831164 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: E1001 13:06:41.852906 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.857836 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.857911 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.857936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.857969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.857994 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: E1001 13:06:41.878559 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:41Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:41 crc kubenswrapper[4749]: E1001 13:06:41.878904 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.881540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.881783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.881936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.882079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.882212 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.985916 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.985986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.986006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.986036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:41 crc kubenswrapper[4749]: I1001 13:06:41.986055 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:41Z","lastTransitionTime":"2025-10-01T13:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.089622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.089685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.089703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.089730 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.089747 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.193872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.194190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.194515 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.194671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.194799 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.228843 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.228851 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:42 crc kubenswrapper[4749]: E1001 13:06:42.229200 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.228918 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:42 crc kubenswrapper[4749]: E1001 13:06:42.229422 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:42 crc kubenswrapper[4749]: E1001 13:06:42.229774 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.297726 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.297797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.297815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.297843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.297864 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.400899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.400969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.400990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.401017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.401035 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.503297 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.503338 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.503346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.503365 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.503374 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.606190 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.606266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.606280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.606301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.606324 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.710065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.710136 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.710155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.710208 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.710260 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.813583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.813850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.813930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.814017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.814092 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.917761 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.917977 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.918110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.918211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:42 crc kubenswrapper[4749]: I1001 13:06:42.918329 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:42Z","lastTransitionTime":"2025-10-01T13:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.020803 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.021036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.021123 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.021263 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.021407 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.125591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.125948 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.126148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.126389 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.126793 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.229797 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:43 crc kubenswrapper[4749]: E1001 13:06:43.230032 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.230208 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.230287 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.230309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.230336 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.230354 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.333754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.333809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.333827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.333857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.333875 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.437481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.437536 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.437553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.437578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.437596 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.541568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.541618 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.541636 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.541663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.541683 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.644994 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.645068 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.645085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.645113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.645130 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.748400 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.748442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.748452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.748467 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.748477 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.851305 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.851375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.851399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.851508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.851532 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.954794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.954885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.954924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.954958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:43 crc kubenswrapper[4749]: I1001 13:06:43.954978 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:43Z","lastTransitionTime":"2025-10-01T13:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.057998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.058112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.058133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.058160 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.058237 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.160635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.160680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.160689 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.160705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.160715 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.229520 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.229562 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.229520 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:06:44 crc kubenswrapper[4749]: E1001 13:06:44.229682 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:06:44 crc kubenswrapper[4749]: E1001 13:06:44.229848 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:06:44 crc kubenswrapper[4749]: E1001 13:06:44.229976 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.263063 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.263115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.263132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.263157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.263175 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.365753 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.365812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.365830 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.365856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.365876 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.469559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.469904 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.469919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.469938 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.469955 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.573342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.573397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.573417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.573442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.573459 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.675645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.675697 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.675710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.675729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.675741 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.784283 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.785020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.785060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.785092 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.785110 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.888616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.888688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.888705 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.888731 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.888750 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.991408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.991469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.991487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.991513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:44 crc kubenswrapper[4749]: I1001 13:06:44.991531 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:44Z","lastTransitionTime":"2025-10-01T13:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.094439 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.094486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.094498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.094518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.094531 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.197327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.197382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.197391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.197411 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.197423 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.229814 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:06:45 crc kubenswrapper[4749]: E1001 13:06:45.230004 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.301269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.301344 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.301361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.301780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.301835 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.404805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.404844 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.404852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.404867 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.404876 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.506810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.506871 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.506883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.506908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.506924 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.609622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.609666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.609675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.609691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.609713 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.712407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.712443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.712451 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.712466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.712477 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.813943 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.814011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.814029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.814054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.814075 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.917656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.917722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.917736 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.917758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:45 crc kubenswrapper[4749]: I1001 13:06:45.917771 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:45Z","lastTransitionTime":"2025-10-01T13:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.020651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.020715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.020738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.020771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.020797 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.123983 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.124047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.124069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.124103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.124125 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.227071 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.227117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.227128 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.227144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.227153 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.229454 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.229584 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:06:46 crc kubenswrapper[4749]: E1001 13:06:46.229689 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.229764 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:06:46 crc kubenswrapper[4749]: E1001 13:06:46.229878 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:06:46 crc kubenswrapper[4749]: E1001 13:06:46.230007 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.230650 4749 scope.go:117] "RemoveContainer" containerID="89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444"
Oct 01 13:06:46 crc kubenswrapper[4749]: E1001 13:06:46.230889 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.329130 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.329174 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.329185 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.329204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.329236 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.432435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.432489 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.432504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.432533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.432550 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.535361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.535424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.535440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.535466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.535483 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.639341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.639401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.639417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.639477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.639492 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.741828 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.741883 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.741896 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.741914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.742377 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.845003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.845059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.845075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.845097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.845113 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.948256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.948320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.948335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.948358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:46 crc kubenswrapper[4749]: I1001 13:06:46.948370 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:46Z","lastTransitionTime":"2025-10-01T13:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.050813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.050874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.050894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.050919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.050938 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.154114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.154398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.154496 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.154587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.154670 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.229331 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:47 crc kubenswrapper[4749]: E1001 13:06:47.229563 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.256928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.257000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.257019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.257054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.257072 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.359797 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.359840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.359850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.359868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.359879 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.463282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.463326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.463337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.463354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.463365 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.565866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.565923 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.565937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.565957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.565972 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.668442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.668476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.668484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.668499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.668508 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.770965 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.771011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.771028 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.771048 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.771059 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.873584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.873635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.873648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.873667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.873681 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.976424 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.976484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.976502 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.976528 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:47 crc kubenswrapper[4749]: I1001 13:06:47.976546 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:47Z","lastTransitionTime":"2025-10-01T13:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.079452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.079500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.079512 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.079535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.079546 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.183137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.183211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.183261 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.183289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.183310 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.228848 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.228942 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:48 crc kubenswrapper[4749]: E1001 13:06:48.229047 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.229346 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:48 crc kubenswrapper[4749]: E1001 13:06:48.229457 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:48 crc kubenswrapper[4749]: E1001 13:06:48.229660 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.285557 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.285628 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.285642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.285656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.285668 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.388763 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.388802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.388812 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.388830 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.388842 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.492256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.492321 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.492341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.492367 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.492388 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.595052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.595109 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.595129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.595153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.595173 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.697405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.697472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.697490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.697519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.697539 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.800285 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.800337 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.800357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.800380 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.800397 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.902820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.902847 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.902858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.902870 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:48 crc kubenswrapper[4749]: I1001 13:06:48.902877 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:48Z","lastTransitionTime":"2025-10-01T13:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.004903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.004956 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.004975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.004999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.005018 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.107791 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.107820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.107830 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.107843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.107852 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.209638 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.209686 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.209706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.209729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.209746 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.229413 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:49 crc kubenswrapper[4749]: E1001 13:06:49.229593 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.312011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.312064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.312079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.312101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.312117 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.415024 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.415074 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.415082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.415097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.415106 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.517661 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.517740 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.517766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.517802 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.517826 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.621114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.621165 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.621182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.621207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.621248 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.724399 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.724460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.724479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.724506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.724524 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.826674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.826703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.826710 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.826723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.826735 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.929358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.929407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.929423 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.929449 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:49 crc kubenswrapper[4749]: I1001 13:06:49.929466 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:49Z","lastTransitionTime":"2025-10-01T13:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.032972 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.033027 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.033044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.033069 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.033087 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.136095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.136141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.136152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.136169 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.136180 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.229020 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.229038 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.229020 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:50 crc kubenswrapper[4749]: E1001 13:06:50.229176 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:50 crc kubenswrapper[4749]: E1001 13:06:50.229260 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:50 crc kubenswrapper[4749]: E1001 13:06:50.229532 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.242542 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.242652 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.242663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.242685 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.242695 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.345562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.345632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.345647 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.345671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.345685 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.448907 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.448963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.448973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.448991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.449003 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.551814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.551890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.551900 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.551920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.551932 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.654876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.654945 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.654967 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.654996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.655016 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.757809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.757877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.757895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.757917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.757933 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.860588 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.860651 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.860664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.860688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.860704 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.963286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.963343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.963355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.963375 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:50 crc kubenswrapper[4749]: I1001 13:06:50.963386 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:50Z","lastTransitionTime":"2025-10-01T13:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.066138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.066199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.066246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.066275 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.066294 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.168507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.168543 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.168551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.168567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.168576 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.229588 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:51 crc kubenswrapper[4749]: E1001 13:06:51.229802 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.241708 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.257783 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.270980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc 
kubenswrapper[4749]: I1001 13:06:51.271016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.271030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.271051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.271064 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.283829 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.300965 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.317991 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.334262 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.347722 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.361854 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.372825 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.372856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.372865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.372885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.372899 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.374207 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.386323 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.402061 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.413164 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.441663 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.475675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.475728 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.475749 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.475770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.475783 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.492065 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.509376 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f124
5e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.534765 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554
ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.543069 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:51 crc kubenswrapper[4749]: E1001 13:06:51.543262 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:51 crc kubenswrapper[4749]: E1001 13:06:51.543354 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs podName:27497171-a8cc-4282-8ee6-2f68f768fc69 nodeName:}" failed. 
No retries permitted until 2025-10-01 13:07:23.543331623 +0000 UTC m=+103.597316522 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs") pod "network-metrics-daemon-mwlpq" (UID: "27497171-a8cc-4282-8ee6-2f68f768fc69") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.550913 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.564158 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:51Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.577987 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.578045 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.578059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.578081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.578092 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.680522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.680587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.680606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.680635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.680654 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.783665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.783720 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.783734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.783758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.783771 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.886861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.886918 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.886936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.886963 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.886981 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.990168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.990255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.990279 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.990306 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:51 crc kubenswrapper[4749]: I1001 13:06:51.990324 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:51Z","lastTransitionTime":"2025-10-01T13:06:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.093774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.093842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.093861 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.093888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.093907 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.197602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.197664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.197674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.197692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.197702 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.229157 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.229159 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.229473 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.229574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.229596 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.229605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.229481 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.229309 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.229621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.229686 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.229735 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.242582 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.246446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.246506 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.246523 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.246555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.246576 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.260436 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.263533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.263569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.263581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.263602 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.263614 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.279424 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.283196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.283277 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.283288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.283301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.283324 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.293800 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.298453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.298478 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.298488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.298501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.298512 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.308687 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: E1001 13:06:52.308941 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.310291 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.310336 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.310354 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.310374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.310392 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.412957 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.412988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.412999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.413016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.413027 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.515394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.515429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.515443 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.515458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.515473 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.618333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.618391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.618404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.618425 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.618437 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.721953 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.722060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.722079 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.722107 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.722133 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.808723 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/0.log" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.808924 4749 generic.go:334] "Generic (PLEG): container finished" podID="33919a9e-1f0d-4127-915d-17d77d78853e" containerID="d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233" exitCode=1 Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.808988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrgp7" event={"ID":"33919a9e-1f0d-4127-915d-17d77d78853e","Type":"ContainerDied","Data":"d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.809770 4749 scope.go:117] "RemoveContainer" containerID="d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.828466 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.828993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.829021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.829031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.829049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.829062 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.841417 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"2025-10-01T13:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f\\\\n2025-10-01T13:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f to /host/opt/cni/bin/\\\\n2025-10-01T13:06:07Z [verbose] multus-daemon started\\\\n2025-10-01T13:06:07Z [verbose] Readiness Indicator file check\\\\n2025-10-01T13:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.859797 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy 
event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.873314 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.889197 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.905001 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.921830 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.931882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.931922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.931935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.931955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.931968 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:52Z","lastTransitionTime":"2025-10-01T13:06:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.934569 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.953122 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.966746 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc 
kubenswrapper[4749]: I1001 13:06:52.981175 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:52 crc kubenswrapper[4749]: I1001 13:06:52.999488 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:52Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.015329 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.033674 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.034565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.034600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.034614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.034634 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.034646 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.059743 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.079856 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.094434 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a858697955
2556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.112054 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.136645 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.136665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.136675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.136691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.136701 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.232046 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:53 crc kubenswrapper[4749]: E1001 13:06:53.232195 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.239062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.239097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.239110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.239127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.239141 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.341254 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.341290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.341298 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.341317 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.341329 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.443191 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.443244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.443255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.443269 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.443279 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.545083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.545131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.545140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.545161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.545173 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.646825 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.646874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.646885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.646903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.646917 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.748267 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.748303 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.748312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.748349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.748359 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.814086 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/0.log" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.814149 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrgp7" event={"ID":"33919a9e-1f0d-4127-915d-17d77d78853e","Type":"ContainerStarted","Data":"b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.832853 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.848747 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"2025-10-01T13:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f\\\\n2025-10-01T13:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f to /host/opt/cni/bin/\\\\n2025-10-01T13:06:07Z [verbose] multus-daemon started\\\\n2025-10-01T13:06:07Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.850351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.850469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.850497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.850526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.850544 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.880788 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy 
event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.896824 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.914989 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.934093 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.952350 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.953065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.953118 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.953135 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.953162 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.953180 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:53Z","lastTransitionTime":"2025-10-01T13:06:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.974383 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:53 crc kubenswrapper[4749]: I1001 13:06:53.988997 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:53Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.006678 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.024043 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.038747 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.055104 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.055741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.055813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.055832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.055859 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.055879 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.077129 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.094252 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.113668 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a858697955
2556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.134529 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.158466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.158498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.158533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.158551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.158560 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.166532 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:54Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.229734 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.229745 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.229776 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:54 crc kubenswrapper[4749]: E1001 13:06:54.229850 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:54 crc kubenswrapper[4749]: E1001 13:06:54.229990 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:54 crc kubenswrapper[4749]: E1001 13:06:54.230180 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.260813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.260856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.260868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.260891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.260906 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.364530 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.364573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.364585 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.364605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.364620 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.467326 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.467369 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.467383 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.467402 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.467415 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.570049 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.570121 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.570140 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.570171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.570190 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.673414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.673481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.673501 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.673531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.673549 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.776578 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.776640 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.776659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.776683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.776703 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.879724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.879776 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.879789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.879808 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.879820 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.982255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.982300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.982314 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.982332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:54 crc kubenswrapper[4749]: I1001 13:06:54.982345 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:54Z","lastTransitionTime":"2025-10-01T13:06:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.085453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.085518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.085539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.085567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.085591 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.188405 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.188447 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.188457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.188473 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.188483 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.229002 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:55 crc kubenswrapper[4749]: E1001 13:06:55.229182 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.291331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.291427 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.291444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.291473 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.291491 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.393990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.394057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.394076 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.394102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.394136 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.497329 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.497380 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.497390 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.497407 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.497417 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.600533 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.600599 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.600616 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.600641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.600659 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.702979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.703041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.703059 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.703088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.703112 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.806353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.806430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.806442 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.806460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.806471 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.909382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.909426 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.909437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.909455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:55 crc kubenswrapper[4749]: I1001 13:06:55.909466 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:55Z","lastTransitionTime":"2025-10-01T13:06:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.011905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.011970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.011990 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.012017 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.012037 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.119858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.119910 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.119923 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.119944 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.119957 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.222970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.223034 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.223057 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.223090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.223114 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.229437 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:56 crc kubenswrapper[4749]: E1001 13:06:56.229581 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.229441 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.229443 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:56 crc kubenswrapper[4749]: E1001 13:06:56.229758 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:56 crc kubenswrapper[4749]: E1001 13:06:56.229900 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.326404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.326462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.326479 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.326502 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.326521 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.429786 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.429872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.429894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.430357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.430564 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.534422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.534498 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.534521 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.534553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.534576 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.638630 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.638716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.638772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.638805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.638827 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.741625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.742351 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.742391 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.742420 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.742438 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.846129 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.846293 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.846320 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.846384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.846405 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.949385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.949444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.949466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.949493 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:56 crc kubenswrapper[4749]: I1001 13:06:56.949511 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:56Z","lastTransitionTime":"2025-10-01T13:06:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.052313 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.052401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.052415 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.052447 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.052463 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.155813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.155869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.155879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.155896 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.155906 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.229999 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:57 crc kubenswrapper[4749]: E1001 13:06:57.230321 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.258569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.258644 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.258662 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.258694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.258718 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.361090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.361151 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.361168 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.361193 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.361211 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.463782 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.463843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.463862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.463887 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.463902 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.567061 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.567132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.567149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.567179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.567197 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.669826 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.669899 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.669925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.669961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.669986 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.773379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.773465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.773488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.773519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.773542 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.876745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.876805 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.876817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.876840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.876854 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.979885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.979936 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.979949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.979970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:57 crc kubenswrapper[4749]: I1001 13:06:57.979981 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:57Z","lastTransitionTime":"2025-10-01T13:06:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.083435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.083505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.083526 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.083556 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.083577 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.186876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.187026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.187047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.187111 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.187133 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.229633 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.229794 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:06:58 crc kubenswrapper[4749]: E1001 13:06:58.229852 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.229950 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:06:58 crc kubenswrapper[4749]: E1001 13:06:58.230341 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:06:58 crc kubenswrapper[4749]: E1001 13:06:58.232336 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.290770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.290839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.290858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.290886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.290905 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.393939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.393997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.394009 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.394029 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.394042 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.497149 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.497244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.497264 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.497292 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.497311 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.600179 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.600288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.600308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.600333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.600350 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.703433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.703476 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.703489 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.703513 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.703528 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.806034 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.806097 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.806115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.806141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.806159 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.909137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.909203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.909246 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.909278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:58 crc kubenswrapper[4749]: I1001 13:06:58.909297 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:58Z","lastTransitionTime":"2025-10-01T13:06:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.012060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.012124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.012142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.012170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.012196 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.115692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.115754 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.115772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.115799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.115818 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.218762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.218831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.218856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.218891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.218917 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.229156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:06:59 crc kubenswrapper[4749]: E1001 13:06:59.229395 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.230564 4749 scope.go:117] "RemoveContainer" containerID="89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.324001 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.324080 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.324105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.324139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.324162 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.427445 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.427492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.427508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.427532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.427550 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.531850 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.531911 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.531931 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.531959 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.531978 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.636841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.636882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.636891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.636909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.636918 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.740549 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.740607 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.740622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.740642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.740655 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.836175 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/2.log" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.838839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.839319 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.842759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.842807 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.842825 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.842845 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.842864 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.852954 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a5050
09615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.864018 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"2025-10-01T13:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f\\\\n2025-10-01T13:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f to /host/opt/cni/bin/\\\\n2025-10-01T13:06:07Z [verbose] multus-daemon started\\\\n2025-10-01T13:06:07Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.879284 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.895181 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.909470 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.925172 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.937236 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.946507 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.946541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.946551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.946569 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.946581 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:06:59Z","lastTransitionTime":"2025-10-01T13:06:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.949499 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.959044 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.974189 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.986391 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:06:59 crc kubenswrapper[4749]: I1001 13:06:59.998320 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:06:59Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.009690 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.023867 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.034545 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.046780 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a858697955
2556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.048631 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.048690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.048702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.048722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.048735 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.065490 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.089880 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.151660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.152023 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.152099 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.152204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.152313 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.229699 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.229995 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:00 crc kubenswrapper[4749]: E1001 13:07:00.230144 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:00 crc kubenswrapper[4749]: E1001 13:07:00.230199 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.230300 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:00 crc kubenswrapper[4749]: E1001 13:07:00.230470 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.255643 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.255937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.256036 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.256134 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.256254 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.359595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.359872 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.360018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.360349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.360561 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.463580 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.463909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.463999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.464105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.464201 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.566933 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.567020 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.567044 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.567073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.567090 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.670775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.670846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.670863 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.670893 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.670912 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.773093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.773155 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.773175 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.773202 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.773249 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.845128 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/3.log" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.846789 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/2.log" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.850888 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" exitCode=1 Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.850957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.851024 4749 scope.go:117] "RemoveContainer" containerID="89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.852558 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:07:00 crc kubenswrapper[4749]: E1001 13:07:00.854447 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.875920 4749 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.876005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.876030 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.876064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.876088 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.879077 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.899447 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a858697955
2556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.920453 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.941691 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.960601 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"2025-10-01T13:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f\\\\n2025-10-01T13:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f to /host/opt/cni/bin/\\\\n2025-10-01T13:06:07Z [verbose] multus-daemon started\\\\n2025-10-01T13:06:07Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.980527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.980562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.980571 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.980586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.980596 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:00Z","lastTransitionTime":"2025-10-01T13:07:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:00 crc kubenswrapper[4749]: I1001 13:07:00.984166 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy 
event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:07:00Z\\\",\\\"message\\\":\\\"ndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:07:00.173251 6749 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:07:00.173313 6749 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:07:00.173294 6749 
reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:07:00.173447 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1001 13:07:00.180856 6749 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1001 13:07:00.180890 6749 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1001 13:07:00.181036 6749 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:07:00.181123 6749 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:07:00.181338 6749 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.002316 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:00Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.017025 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc 
kubenswrapper[4749]: I1001 13:07:01.038934 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.058249 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.077813 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.082958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.083026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.083052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.083084 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.083130 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.095033 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.107196 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.121434 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.133670 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.145945 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.160767 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.182622 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.185395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.185462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.185477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.185500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.185514 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.229033 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:01 crc kubenswrapper[4749]: E1001 13:07:01.229274 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.248428 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.264513 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.278982 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.296933 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.297005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.297026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.297054 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.297073 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.301248 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc 
kubenswrapper[4749]: I1001 13:07:01.321405 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.343957 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.360743 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.389688 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.400716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.400774 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.400799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.400835 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.400861 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.414303 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.432439 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.449956 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.470376 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.500626 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
25-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.504566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.504736 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.504839 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.504962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.505076 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.524501 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' 
detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.545620 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"2025-10-01T13:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f\\\\n2025-10-01T13:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f to /host/opt/cni/bin/\\\\n2025-10-01T13:06:07Z [verbose] multus-daemon started\\\\n2025-10-01T13:06:07Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.575049 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89fa46a9a1ca6d5c2f80903118e73d0a5c371237a3e3b8bb64944600b71ae444\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:30Z\\\",\\\"message\\\":\\\"1.EgressIP event handler 8 for removal\\\\nI1001 13:06:30.488710 6389 handler.go:208] 
Removed *v1.Pod event handler 3\\\\nI1001 13:06:30.488738 6389 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1001 13:06:30.488686 6389 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 13:06:30.488802 6389 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 13:06:30.488849 6389 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 13:06:30.488906 6389 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1001 13:06:30.488936 6389 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1001 13:06:30.488950 6389 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 13:06:30.488938 6389 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 13:06:30.488989 6389 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1001 13:06:30.489040 6389 factory.go:656] Stopping watch factory\\\\nI1001 13:06:30.489087 6389 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 13:06:30.489133 6389 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1001 13:06:30.489138 6389 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:06:30.489249 6389 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:06:30.489404 6389 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:07:00Z\\\",\\\"message\\\":\\\"ndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 13:07:00.173251 6749 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 
13:07:00.173313 6749 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:07:00.173294 6749 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:07:00.173447 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1001 13:07:00.180856 6749 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1001 13:07:00.180890 6749 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1001 13:07:00.181036 6749 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:07:00.181123 6749 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:07:00.181338 6749 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.593526 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.608930 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.609008 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.609072 4749 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.609152 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.609173 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.616377 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.711920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.711998 4749 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.712019 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.712050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.712069 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.815062 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.815560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.815707 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.815881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.816017 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.857992 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/3.log" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.863506 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:07:01 crc kubenswrapper[4749]: E1001 13:07:01.863895 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.896613 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.916093 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.918490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.918544 4749 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.918559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.918581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.918599 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:01Z","lastTransitionTime":"2025-10-01T13:07:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.934851 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.949769 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plan
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b257bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.966783 4749 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:01 crc kubenswrapper[4749]: I1001 13:07:01.986372 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"2025-10-01T13:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f\\\\n2025-10-01T13:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f to /host/opt/cni/bin/\\\\n2025-10-01T13:06:07Z [verbose] multus-daemon started\\\\n2025-10-01T13:06:07Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:01Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.015755 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:07:00Z\\\",\\\"message\\\":\\\"ndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 
13:07:00.173251 6749 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:07:00.173313 6749 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:07:00.173294 6749 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:07:00.173447 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1001 13:07:00.180856 6749 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1001 13:07:00.180890 6749 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1001 13:07:00.181036 6749 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:07:00.181123 6749 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:07:00.181338 6749 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.021595 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.021641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.021655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.021677 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.021692 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.030054 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.042029 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc 
kubenswrapper[4749]: I1001 13:07:02.060160 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.074173 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.092046 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.105927 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.124397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.124440 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.124452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.124471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.124483 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.127881 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.145814 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.160821 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.175482 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.189188 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.227586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.227639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.227656 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.227682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.227699 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.229760 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.229820 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.229924 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.229939 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.230084 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.230207 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.330657 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.331013 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.331115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.331361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.331504 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.433822 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.434105 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.434166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.434259 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.434331 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.536842 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.537110 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.537205 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.537361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.537522 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.545095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.545127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.545142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.545157 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.545172 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.560557 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.567163 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.567204 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.567225 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.567245 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.567257 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.586059 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.591011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.591073 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.591091 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.591122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.591142 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.611356 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.615827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.615877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.615894 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.615939 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.615954 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.631344 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.635025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.635064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.635081 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.635102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.635118 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.655073 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:02Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:02 crc kubenswrapper[4749]: E1001 13:07:02.655634 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.657980 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.658161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.658327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.658474 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.658612 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.761925 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.762005 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.762024 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.762051 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.762072 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.864814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.864875 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.864895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.864921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.864938 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.968295 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.968374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.968393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.968422 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:02 crc kubenswrapper[4749]: I1001 13:07:02.968440 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:02Z","lastTransitionTime":"2025-10-01T13:07:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.071321 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.071359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.071370 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.071394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.071409 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.174192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.174293 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.174312 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.174343 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.174361 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.228993 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:03 crc kubenswrapper[4749]: E1001 13:07:03.229212 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.277021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.277085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.277108 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.277145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.277167 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.384809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.384886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.384908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.384940 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.384961 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.488090 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.488437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.488581 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.488759 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.488895 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.593041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.593095 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.593113 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.593138 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.593155 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.696878 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.696955 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.696973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.701486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.701582 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.804750 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.804818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.804831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.804853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.804868 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.908736 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.908810 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.908829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.908858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:03 crc kubenswrapper[4749]: I1001 13:07:03.908878 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:03Z","lastTransitionTime":"2025-10-01T13:07:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.012527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.012618 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.012642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.012673 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.012693 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.115734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.115814 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.115832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.115858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.115877 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.218745 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.218801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.218818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.218846 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.218863 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.229194 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.229203 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:04 crc kubenswrapper[4749]: E1001 13:07:04.229443 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.229200 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:04 crc kubenswrapper[4749]: E1001 13:07:04.229647 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:04 crc kubenswrapper[4749]: E1001 13:07:04.229809 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.322047 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.322112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.322131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.322156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.322177 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.425621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.425720 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.425748 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.425783 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.425806 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.529082 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.529150 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.529167 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.529194 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.529270 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.633182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.633316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.633342 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.633374 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.633398 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.735766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.735824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.735841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.735865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.735885 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.839040 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.839137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.839159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.839277 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.839314 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.942078 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.942159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.942182 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.942210 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:04 crc kubenswrapper[4749]: I1001 13:07:04.942260 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:04Z","lastTransitionTime":"2025-10-01T13:07:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.046060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.046153 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.046171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.046196 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.046286 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.149033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.149166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.149184 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.149207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.149256 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.229145 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:05 crc kubenswrapper[4749]: E1001 13:07:05.229414 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.252346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.252413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.252434 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.252457 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.252475 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.357290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.357600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.357629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.357709 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.357794 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.461356 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.461401 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.461412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.461430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.461442 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.565659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.565723 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.565734 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.565755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.565772 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.669289 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.669359 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.669379 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.669412 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.669430 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.772615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.772690 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.772711 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.772746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.772768 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.877070 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.877132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.877145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.877166 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.877179 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.980738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.981132 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.981142 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.981159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:05 crc kubenswrapper[4749]: I1001 13:07:05.981248 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:05Z","lastTransitionTime":"2025-10-01T13:07:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.084397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.084466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.084484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.084509 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.084527 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.187619 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.187670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.187682 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.187702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.187713 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.229547 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.229547 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.229581 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:06 crc kubenswrapper[4749]: E1001 13:07:06.230406 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:06 crc kubenswrapper[4749]: E1001 13:07:06.230409 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:06 crc kubenswrapper[4749]: E1001 13:07:06.230490 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.290938 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.291012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.291033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.291060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.291079 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.395683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.395751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.395770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.395799 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.395818 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.499843 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.499908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.499928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.499952 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.499967 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.603122 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.603188 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.603203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.603245 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.603261 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.706583 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.706629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.706639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.706655 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.706665 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.809800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.809868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.809888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.809920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.809961 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.913706 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.913779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.913798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.913827 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:06 crc kubenswrapper[4749]: I1001 13:07:06.913848 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:06Z","lastTransitionTime":"2025-10-01T13:07:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.018654 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.018739 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.018766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.018800 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.018825 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.122075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.122171 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.122189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.122237 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.122255 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.225266 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.225325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.225335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.225366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.225376 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.229839 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:07 crc kubenswrapper[4749]: E1001 13:07:07.230104 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.327920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.327973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.327982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.328000 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.328011 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.430626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.430663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.430671 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.430687 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.430696 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.533384 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.533448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.533466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.533495 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.533519 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.636818 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.636917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.636935 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.636968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.637017 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.740816 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.740900 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.740917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.740951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.740974 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.845133 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.845189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.845198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.845239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.845254 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.948006 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.948094 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.948117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.948148 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:07 crc kubenswrapper[4749]: I1001 13:07:07.948167 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:07Z","lastTransitionTime":"2025-10-01T13:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.051050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.051102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.051115 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.051131 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.051142 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.149478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.149654 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 13:08:12.149618014 +0000 UTC m=+152.203602943 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.149718 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.149776 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.149820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.149898 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150020 4749 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150042 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150077 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150077 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150107 4749 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150117 4749 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150127 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.150095908 +0000 UTC m=+152.204080827 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150139 4749 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150175 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.15015357 +0000 UTC m=+152.204138519 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150211 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.150195521 +0000 UTC m=+152.204180470 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150317 4749 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.150382 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.150366836 +0000 UTC m=+152.204351775 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.154477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.154527 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.154544 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.154566 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.154583 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.229696 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.229793 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.229845 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.230288 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.230355 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:08 crc kubenswrapper[4749]: E1001 13:07:08.230457 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.257778 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.257833 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.257848 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.257866 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.257881 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.361385 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.361437 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.361452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.361471 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.361483 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.465308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.465392 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.465414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.465446 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.465473 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.569180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.569309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.569332 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.569361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.569379 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.672417 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.672487 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.672505 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.672532 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.672548 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.776203 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.776300 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.776325 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.776355 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.776379 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.879409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.879485 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.879508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.879540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.879563 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.983539 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.983978 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.984018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.984185 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:08 crc kubenswrapper[4749]: I1001 13:07:08.984214 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:08Z","lastTransitionTime":"2025-10-01T13:07:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.087358 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.087433 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.087458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.087492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.087511 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.189951 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.190041 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.190065 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.190103 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.190128 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.229772 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:09 crc kubenswrapper[4749]: E1001 13:07:09.229962 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.293096 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.293144 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.293156 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.293170 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.293180 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.396601 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.396659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.396674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.396695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.396708 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.500127 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.500197 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.500255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.500308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.500330 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.603949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.604037 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.604056 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.604086 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.604110 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.707093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.707167 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.707192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.707260 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.707287 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.809901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.809969 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.809988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.810018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.810038 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.913483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.913555 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.913577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.913603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:09 crc kubenswrapper[4749]: I1001 13:07:09.913622 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:09Z","lastTransitionTime":"2025-10-01T13:07:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.017031 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.017088 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.017102 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.017124 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.017138 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.120346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.120465 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.120486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.120520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.120540 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.224633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.224713 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.224733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.224764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.224785 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.228966 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.229010 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.229014 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:10 crc kubenswrapper[4749]: E1001 13:07:10.229196 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:10 crc kubenswrapper[4749]: E1001 13:07:10.229409 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:10 crc kubenswrapper[4749]: E1001 13:07:10.229577 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.328201 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.328304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.328328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.328353 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.328373 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.431574 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.431648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.431667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.431698 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.431719 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.534565 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.534622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.534639 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.534665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.534683 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.637475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.637567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.637591 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.637622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.637645 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.741137 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.741239 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.741258 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.741286 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.741304 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.844430 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.844491 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.844508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.844540 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.844560 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.947704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.947771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.947789 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.947817 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:10 crc kubenswrapper[4749]: I1001 13:07:10.947835 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:10Z","lastTransitionTime":"2025-10-01T13:07:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.051973 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.052055 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.052074 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.052104 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.052122 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.155623 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.155693 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.155715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.155744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.155761 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.228972 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:11 crc kubenswrapper[4749]: E1001 13:07:11.229183 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.250442 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.263294 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.263372 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.263394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.263431 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.263453 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.267291 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.281785 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-6hrtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94d6a9bc-756a-41ca-9ce3-44b6fc834a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6512eebd5ceb32b9b0dca4387aa9bbcb498703e9796e6494834961301d280e54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gwsdj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-6hrtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.297378 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c763aedc-e75b-471c-83d7-2c9a87da1aaf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://785a1485fe2ded29e24312f2363810b756ae80d031702bc455139154829343c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pchmz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4tfdz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.320996 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aee52150-78bb-49e5-a5ea-dd237863c810\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://954f643fec5c81ce0d33ad96a63826ca1217cbe501e16e4d4fb39fbcf16b98a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://26afe7d2c3dea7898b8ad0a8ea076a21a5f208d352fcdd14a6379b59df9fa767\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d8690528bbeba047746b02f446cd389987482412c76fef287be8e841cdce26ea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08e835219980c16f9d47dbcce70f9d82a0363f8c430f03bd9c9655b481f77a97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f1245e9af45c084de97cd26e4c5d6d4d2e96f8190c548eb6f519c40a75dc327\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a08369eed08c52f500cdb15eeb72a6457e5080fe5115ff8c9c5c990c4796e75e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50d57bfbe544cd23d2e6abc146f528890de96fec0436f1695f70a2d4329e8332\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b7cdg\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-8sqjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.352056 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"28b7a551-68d7-4729-8a08-4a569d0bad73\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be14c63dea1cce48e3731afa02b0d07704effeab86c6e6b08a3f77a27bbb71f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://73279e287f604fa4413a91df4eee81891edb98df85819ea69e92a361e9890651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dec62f6db9226536c3ccec594262b7126b195619993297cb2565162aeb026623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e63598a8d1a838554ebb7ccae59fa4d1f09e3575f619f9e3eb5b3596ec946670\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa79ccb9a135f819ed873b5e1d6831796fbc90d05b24f601ec013cc58a72b26a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://076eab3e635db41f567a2fd64be6d6303ea2d10fdd080cbb4350287d3c128ac0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68de6f9a4e1b35a86b7a95ac38b40d3ed8a92cbc7e2f13cc45adda629946dc86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6fec68d8743c52880a59a6b328fec372d357f30d5d2ec0695f97170c83daafda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.367427 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.367475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.367486 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: 
I1001 13:07:11.367508 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.367521 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.370998 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c47bcc48-5e7d-498a-9e3a-4a6504700516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4016588a07515e4e06c4b8225659fe6a88d2d07372b3eaa3a73e7b3886fd9392\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d25967301e952c3bce3c09f04dba9fe5053b540dc48a97f85f65714b32ba4a33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b15caba67da1cefee306f373b7cf65fa7ddbc971f2ea775c7266cb2a08707130\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c658d842fe00b5109dc51477db57aab6c855a14b68a6d08b511ad0a4124978\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b96b060485f3299acd29c6b6b12cac93df13201939435a13c10e45ab56447a98\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 13:06:04.055590 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 13:06:04.055813 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 13:06:04.056815 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1138899352/tls.crt::/tmp/serving-cert-1138899352/tls.key\\\\\\\"\\\\nI1001 13:06:04.276894 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 13:06:04.281959 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 13:06:04.281979 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 13:06:04.282007 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 13:06:04.282012 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 13:06:04.286380 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 13:06:04.286397 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286402 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 13:06:04.286407 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 13:06:04.286420 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 13:06:04.286423 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 13:06:04.286426 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1001 13:06:04.286582 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1001 13:06:04.288648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fe839ab34c5e59fe6457796a04837708b3bdf32a4fba2aa52ab513afc96c7dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:46Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9e9cb20fe821ab131821880a8586979552556cb0a50d9aecf4d80564de403517\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.395097 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6bd23ba88380aa71eeb95c29f2adf0896ce85553c722c2de69e73506904ee91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.410081 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2eccb66-7d60-4283-b37a-693319a6ac15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915a776839bc9d041901e3a0e0748d62400bb5a2937a9d081abda2b510d5fc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68424d824195c50677ecfbd88a505009615f95e86db6474fa65639cc3e84f86a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ec5d59eb3ea896a6763b20ded1ab1de5dfc99d8416f4011b68058f8552faef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7e899fb755d89e65cee486df4919bdaff9aba76ad251104c20d71ee77c3df61\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:05:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.426196 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-nrgp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33919a9e-1f0d-4127-915d-17d77d78853e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:06:52Z\\\",\\\"message\\\":\\\"2025-10-01T13:06:07+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f\\\\n2025-10-01T13:06:07+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9e97d056-f1ac-410f-8937-1fb5c04c8e8f to /host/opt/cni/bin/\\\\n2025-10-01T13:06:07Z [verbose] multus-daemon started\\\\n2025-10-01T13:06:07Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T13:06:52Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sdvrw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-multus\"/\"multus-nrgp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.446121 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T13:07:00Z\\\",\\\"message\\\":\\\"ndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 
13:07:00.173251 6749 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:07:00.173313 6749 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 13:07:00.173294 6749 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 13:07:00.173447 6749 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1001 13:07:00.180856 6749 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1001 13:07:00.180890 6749 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1001 13:07:00.181036 6749 ovnkube.go:599] Stopped ovnkube\\\\nI1001 13:07:00.181123 6749 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 13:07:00.181338 6749 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8289feb99840295c4
b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T13:06:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T13:06:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lpcbh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-fgjjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.463809 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad4c39e4-a4e8-48b2-9e95-94d5e106257e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cce543fb9902b45c9b4e9c12371799dfadb2c7773820634563f5804106a5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c625bb0c69b185bd24e464f89f059cf4c89b2
57bf726b9b2067da972c182834b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z87wz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4wct8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.470629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.470703 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.470727 4749 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.470762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.470784 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.483308 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c130221d-ace7-4862-9593-63bbaac50dfe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:05:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fc79fed85e2181077c7573f5ceece8bbcd77c13cd5bebd11297ebf827906f18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d934a7d5b04cc68acae739a500ee950b823240ad83c76d4f970171b373e2dbf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b7cddcc8b101710b6b0bd0e61dfc92310a8a54fbee9df5af197af0b4ff7849c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8215ed4947e562d334f7ed96837264ddc6a414401f7a9a7d9e3ccc89a0351ee2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:05:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:05:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.500822 4749 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:04Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.518027 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e911fbb7e14ca1b0a0ad348dd64c050a09db07f0c6000ba71a63be010d653b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://020246ac30f787dbad920b92455c45df7cdeeaf5ab7dcb95e97dae15b0b548f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.536490 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7418bb4e2a6bc0551b7bb9b1a310da2609d070f641680dfbc1411eaede0c0953\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.550747 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xcxs2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"027ade0f-680f-4066-8e28-d362fd24c84a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df25fc57fe336f4dbb641ab541fdefa56d9bff6eab630f0b2604c565625968ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T13:06:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jghf7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:07Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xcxs2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.567499 4749 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27497171-a8cc-4282-8ee6-2f68f768fc69\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T13:06:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d57dh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T13:06:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mwlpq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:11Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.573758 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.573809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.573829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.573854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.573871 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.676909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.676970 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.676988 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.677016 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.677033 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.779571 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.779660 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.779674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.779695 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.779711 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.882547 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.882603 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.882615 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.882635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.882653 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.985522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.985590 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.985608 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.985633 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:11 crc kubenswrapper[4749]: I1001 13:07:11.985652 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:11Z","lastTransitionTime":"2025-10-01T13:07:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.088746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.088824 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.088851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.088901 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.088928 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.191117 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.191180 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.191198 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.191257 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.191281 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.229771 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.229857 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.229816 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:12 crc kubenswrapper[4749]: E1001 13:07:12.229981 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:12 crc kubenswrapper[4749]: E1001 13:07:12.230039 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:12 crc kubenswrapper[4749]: E1001 13:07:12.230145 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.293568 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.293625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.293642 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.293667 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.293686 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.396159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.396459 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.396537 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.396632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.396715 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.499869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.499950 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.499975 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.500010 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.500033 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.602680 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.602732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.602752 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.602780 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.602798 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.710962 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.711026 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.711045 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.711072 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.711090 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.813675 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.813738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.813762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.813794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.813816 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.899340 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.899382 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.899393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.899413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.899424 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: E1001 13:07:12.915633 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.921813 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.921853 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.921865 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.921886 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.921900 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: E1001 13:07:12.942261 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.948611 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.948674 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.948691 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.948719 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.948740 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: E1001 13:07:12.965561 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.970919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.970989 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.971012 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.971075 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.971099 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:12 crc kubenswrapper[4749]: E1001 13:07:12.988894 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:12Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.993764 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.993832 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.993852 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.993880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:12 crc kubenswrapper[4749]: I1001 13:07:12.993900 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:12Z","lastTransitionTime":"2025-10-01T13:07:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: E1001 13:07:13.009089 4749 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T13:07:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0a611ef5-4004-4e1a-b60a-007b3d7463fd\\\",\\\"systemUUID\\\":\\\"a6812da7-649f-44b0-a677-765984715a01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T13:07:13Z is after 2025-08-24T17:21:41Z" Oct 01 13:07:13 crc kubenswrapper[4749]: E1001 13:07:13.009337 4749 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.012637 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.012699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.012720 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.012746 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.012764 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.115587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.115635 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.115649 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.115670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.115701 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.218350 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.218428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.218452 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.218484 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.218508 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.229827 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:13 crc kubenswrapper[4749]: E1001 13:07:13.230035 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.320926 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.320991 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.321009 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.321032 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.321050 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.424756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.424840 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.424858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.424888 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.424907 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.528161 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.528288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.528309 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.528334 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.528351 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.631460 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.631546 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.631575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.631608 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.631629 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.734858 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.734920 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.734947 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.734976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.734996 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.838413 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.838482 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.838499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.838529 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.838552 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.941606 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.941688 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.941714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.941747 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:13 crc kubenswrapper[4749]: I1001 13:07:13.941773 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:13Z","lastTransitionTime":"2025-10-01T13:07:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.044841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.044919 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.044942 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.044976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.045002 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.147938 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.148271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.148447 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.148625 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.148653 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.229296 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.229354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.229594 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:14 crc kubenswrapper[4749]: E1001 13:07:14.229799 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:14 crc kubenswrapper[4749]: E1001 13:07:14.229889 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:14 crc kubenswrapper[4749]: E1001 13:07:14.229947 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.246813 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.252829 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.252898 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.252921 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.252948 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.252968 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.355646 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.355722 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.355742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.355771 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.355788 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.458531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.458587 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.458605 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.458632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.458649 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.561146 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.561201 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.561244 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.561271 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.561289 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.664466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.664524 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.664541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.664567 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.664585 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.768268 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.768378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.768398 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.768418 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.768436 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.871573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.871672 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.871692 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.871716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.871733 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.974915 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.974982 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.974999 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.975025 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:14 crc kubenswrapper[4749]: I1001 13:07:14.975042 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:14Z","lastTransitionTime":"2025-10-01T13:07:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.077477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.077755 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.077766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.077784 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.077795 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.180406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.180453 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.180462 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.180483 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.180493 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.229143 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:15 crc kubenswrapper[4749]: E1001 13:07:15.229300 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.282724 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.282751 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.282760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.282775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.282784 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.385757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.385801 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.385820 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.385841 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.385860 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.488683 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.488756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.488768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.488787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.488800 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.592249 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.592327 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.592347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.592377 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.592398 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.695330 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.695378 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.695395 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.695414 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.695427 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.798648 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.798712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.798730 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.798760 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.798777 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.901773 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.901831 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.901851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.901876 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:15 crc kubenswrapper[4749]: I1001 13:07:15.901894 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:15Z","lastTransitionTime":"2025-10-01T13:07:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.004792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.004913 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.004928 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.004949 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.004961 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.107676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.107735 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.107756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.107785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.107802 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.210664 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.210744 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.210770 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.210807 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.210826 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.229809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.229867 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.229931 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:16 crc kubenswrapper[4749]: E1001 13:07:16.230202 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:16 crc kubenswrapper[4749]: E1001 13:07:16.230368 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:16 crc kubenswrapper[4749]: E1001 13:07:16.230567 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.313931 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.314002 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.314022 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.314052 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.314074 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.417207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.417308 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.417328 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.417357 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.417378 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.524560 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.524626 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.524665 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.524699 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.524726 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.627597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.627663 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.627681 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.627712 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.627736 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.730968 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.731033 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.731050 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.731077 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.731104 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.834316 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.834409 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.834429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.834488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.834507 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.936862 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.937280 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.937499 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.937657 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:16 crc kubenswrapper[4749]: I1001 13:07:16.937775 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:16Z","lastTransitionTime":"2025-10-01T13:07:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.041933 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.042346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.042519 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.042666 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.042806 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.145670 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.145718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.145736 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.145762 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.145780 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.229930 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:17 crc kubenswrapper[4749]: E1001 13:07:17.230121 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.232483 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:07:17 crc kubenswrapper[4749]: E1001 13:07:17.232896 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.247890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.247953 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.247972 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.247997 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.248015 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.351880 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.352333 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.352429 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.352520 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.352594 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.455600 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.455658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.455676 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.455702 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.455721 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.559428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.559717 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.559882 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.560011 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.560138 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.664798 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.664881 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.664895 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.664917 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.664935 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.767497 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.767559 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.767577 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.767604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.767622 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.870387 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.870475 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.870503 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.870575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.870600 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.973408 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.973463 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.973477 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.973500 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:17 crc kubenswrapper[4749]: I1001 13:07:17.973515 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:17Z","lastTransitionTime":"2025-10-01T13:07:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.076347 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.076432 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.076458 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.076490 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.076511 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.179766 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.179837 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.179855 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.179885 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.179903 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.229810 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.229810 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:18 crc kubenswrapper[4749]: E1001 13:07:18.230029 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.229819 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:18 crc kubenswrapper[4749]: E1001 13:07:18.230179 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:18 crc kubenswrapper[4749]: E1001 13:07:18.230367 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.282659 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.282715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.282733 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.282756 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.282775 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.385205 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.385288 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.385307 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.385331 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.385354 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.487694 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.487741 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.487765 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.487794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.487816 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.590718 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.590768 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.590787 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.590809 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.590827 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.694282 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.694349 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.694376 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.694404 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.694426 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.797256 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.797315 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.797335 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.797362 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.797384 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.900890 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.901323 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.901518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.901732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:18 crc kubenswrapper[4749]: I1001 13:07:18.901893 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:18Z","lastTransitionTime":"2025-10-01T13:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.004551 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.004614 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.004632 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.004658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.004676 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.108093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.108199 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.108304 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.108341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.108361 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.211060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.211119 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.211136 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.211160 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.211177 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.228985 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:19 crc kubenswrapper[4749]: E1001 13:07:19.229180 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.314058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.314134 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.314159 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.314192 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.314245 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.417998 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.418064 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.418085 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.418112 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.418129 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.521393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.521481 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.521504 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.521535 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.521556 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.624468 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.624516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.624534 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.624562 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.624581 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.727538 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.727593 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.727604 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.727622 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.727635 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.830518 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.830579 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.830597 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.830621 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.830638 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.933466 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.933553 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.933669 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.933738 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:19 crc kubenswrapper[4749]: I1001 13:07:19.933765 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:19Z","lastTransitionTime":"2025-10-01T13:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.037779 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.037851 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.037879 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.037914 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.037946 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.142658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.142715 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.142732 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.142757 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.142776 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.229360 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.229388 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:20 crc kubenswrapper[4749]: E1001 13:07:20.229529 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.229724 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:20 crc kubenswrapper[4749]: E1001 13:07:20.230278 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:20 crc kubenswrapper[4749]: E1001 13:07:20.230492 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.245619 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.245679 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.245700 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.245729 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.245750 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.348641 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.348716 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.348742 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.348772 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.348793 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.451721 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.451775 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.451792 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.451815 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.451831 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.554444 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.554492 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.554586 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.554609 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.554626 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.657986 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.658038 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.658058 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.658083 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.658100 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.761472 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.761531 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.761550 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.761573 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.761590 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.864777 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.864864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.864891 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.864924 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.864946 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.968296 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.968368 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.968394 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.968428 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:20 crc kubenswrapper[4749]: I1001 13:07:20.968453 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:20Z","lastTransitionTime":"2025-10-01T13:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.071139 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.071189 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.071207 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.071346 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.071404 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.174406 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.174455 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.174469 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.174488 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.174502 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.229821 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:21 crc kubenswrapper[4749]: E1001 13:07:21.230128 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.278299 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.278869 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.278905 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.278937 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.278961 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.283572 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.283544547 podStartE2EDuration="1m16.283544547s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.281788936 +0000 UTC m=+101.335773875" watchObservedRunningTime="2025-10-01 13:07:21.283544547 +0000 UTC m=+101.337529486" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.340645 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.340625519 podStartE2EDuration="1m17.340625519s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.309397541 +0000 UTC m=+101.363382470" watchObservedRunningTime="2025-10-01 13:07:21.340625519 +0000 UTC m=+101.394610418" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.381060 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.381093 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.381101 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.381114 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.381133 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.382023 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.38201128 podStartE2EDuration="7.38201128s" podCreationTimestamp="2025-10-01 13:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.360493751 +0000 UTC m=+101.414478650" watchObservedRunningTime="2025-10-01 13:07:21.38201128 +0000 UTC m=+101.435996179" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.401619 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=48.401601434 podStartE2EDuration="48.401601434s" podCreationTimestamp="2025-10-01 13:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.382543645 +0000 UTC m=+101.436528544" watchObservedRunningTime="2025-10-01 13:07:21.401601434 +0000 UTC m=+101.455586343" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.433952 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nrgp7" podStartSLOduration=77.433924994 podStartE2EDuration="1m17.433924994s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
13:07:21.401897342 +0000 UTC m=+101.455882241" watchObservedRunningTime="2025-10-01 13:07:21.433924994 +0000 UTC m=+101.487909923" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.448402 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4wct8" podStartSLOduration=76.448371299 podStartE2EDuration="1m16.448371299s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.448022749 +0000 UTC m=+101.502007688" watchObservedRunningTime="2025-10-01 13:07:21.448371299 +0000 UTC m=+101.502356238" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.465384 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=77.465358548 podStartE2EDuration="1m17.465358548s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.463936477 +0000 UTC m=+101.517921386" watchObservedRunningTime="2025-10-01 13:07:21.465358548 +0000 UTC m=+101.519343477" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.483658 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.483704 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.483714 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.483730 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.483740 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.544972 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xcxs2" podStartSLOduration=77.544952148 podStartE2EDuration="1m17.544952148s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.521434111 +0000 UTC m=+101.575419050" watchObservedRunningTime="2025-10-01 13:07:21.544952148 +0000 UTC m=+101.598937047" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.586141 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.586386 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.586448 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.586516 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.586609 4749 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.596680 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6hrtf" podStartSLOduration=77.596652406 podStartE2EDuration="1m17.596652406s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.595099221 +0000 UTC m=+101.649084170" watchObservedRunningTime="2025-10-01 13:07:21.596652406 +0000 UTC m=+101.650637335" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.613284 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podStartSLOduration=77.613261784 podStartE2EDuration="1m17.613261784s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:21.61069598 +0000 UTC m=+101.664680879" watchObservedRunningTime="2025-10-01 13:07:21.613261784 +0000 UTC m=+101.667246723" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.631440 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8sqjb" podStartSLOduration=77.631415056 podStartE2EDuration="1m17.631415056s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 13:07:21.631262511 +0000 UTC m=+101.685247410" watchObservedRunningTime="2025-10-01 13:07:21.631415056 +0000 UTC m=+101.685399985" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.689181 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.689301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.689321 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.689344 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.689364 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.792278 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.792341 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.792361 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.792386 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.792404 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.894821 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.894958 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.894985 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.895015 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.895034 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.998301 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.998377 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.998397 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.998435 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:21 crc kubenswrapper[4749]: I1001 13:07:21.998454 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:21Z","lastTransitionTime":"2025-10-01T13:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.101908 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.101961 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.101979 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.102003 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.102019 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.205172 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.205230 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.205241 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.205255 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.205264 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.228933 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.228997 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:22 crc kubenswrapper[4749]: E1001 13:07:22.229085 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.229024 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:22 crc kubenswrapper[4749]: E1001 13:07:22.229267 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:22 crc kubenswrapper[4749]: E1001 13:07:22.229285 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.307864 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.308211 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.308393 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.308522 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.308644 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.411794 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.411856 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.411874 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.411900 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.411920 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.514785 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.514857 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.514877 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.514903 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.514921 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.617922 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.617976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.617996 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.618021 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.618042 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.721909 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.721976 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.721993 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.722018 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.722034 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.824806 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.824854 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.824868 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.824884 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.824896 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.928290 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.928336 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.928348 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.928366 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:22 crc kubenswrapper[4749]: I1001 13:07:22.928378 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:22Z","lastTransitionTime":"2025-10-01T13:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.030541 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.030575 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.030584 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.030598 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.030607 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:23Z","lastTransitionTime":"2025-10-01T13:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.045589 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.045629 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.045638 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.045657 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.045666 4749 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T13:07:23Z","lastTransitionTime":"2025-10-01T13:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.103872 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk"] Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.104535 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.106633 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.106759 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.107297 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.111645 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.229619 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:23 crc kubenswrapper[4749]: E1001 13:07:23.229848 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.230023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.230368 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.230467 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.230600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.230674 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.331933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.331998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.332038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.332136 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: 
\"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.332364 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.332368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.332424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.333956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.343166 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.350555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/583bb0c8-4d2c-4516-ab4d-ea9f6e079eba-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t4rlk\" (UID: \"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.428845 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" Oct 01 13:07:23 crc kubenswrapper[4749]: W1001 13:07:23.453355 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod583bb0c8_4d2c_4516_ab4d_ea9f6e079eba.slice/crio-303025815f802e7f2443af9aafc4c0908d052b4969b5c009c9367682ae225dd3 WatchSource:0}: Error finding container 303025815f802e7f2443af9aafc4c0908d052b4969b5c009c9367682ae225dd3: Status 404 returned error can't find the container with id 303025815f802e7f2443af9aafc4c0908d052b4969b5c009c9367682ae225dd3 Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.636847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:23 crc kubenswrapper[4749]: E1001 13:07:23.637047 4749 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:07:23 crc kubenswrapper[4749]: E1001 13:07:23.637164 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs podName:27497171-a8cc-4282-8ee6-2f68f768fc69 nodeName:}" failed. No retries permitted until 2025-10-01 13:08:27.637137155 +0000 UTC m=+167.691122084 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs") pod "network-metrics-daemon-mwlpq" (UID: "27497171-a8cc-4282-8ee6-2f68f768fc69") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.943405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" event={"ID":"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba","Type":"ContainerStarted","Data":"135ff4ffd6b464947736b0f620e0184db9515b0ada2b6a180924f35437964c1d"} Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.943493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" event={"ID":"583bb0c8-4d2c-4516-ab4d-ea9f6e079eba","Type":"ContainerStarted","Data":"303025815f802e7f2443af9aafc4c0908d052b4969b5c009c9367682ae225dd3"} Oct 01 13:07:23 crc kubenswrapper[4749]: I1001 13:07:23.967190 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t4rlk" podStartSLOduration=79.96716568 podStartE2EDuration="1m19.96716568s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:23.965015968 +0000 UTC m=+104.019000907" watchObservedRunningTime="2025-10-01 13:07:23.96716568 +0000 UTC 
m=+104.021150609" Oct 01 13:07:24 crc kubenswrapper[4749]: I1001 13:07:24.229821 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:24 crc kubenswrapper[4749]: I1001 13:07:24.229877 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:24 crc kubenswrapper[4749]: I1001 13:07:24.229883 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:24 crc kubenswrapper[4749]: E1001 13:07:24.229984 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:24 crc kubenswrapper[4749]: E1001 13:07:24.230158 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:24 crc kubenswrapper[4749]: E1001 13:07:24.230364 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:25 crc kubenswrapper[4749]: I1001 13:07:25.229295 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:25 crc kubenswrapper[4749]: E1001 13:07:25.229796 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:26 crc kubenswrapper[4749]: I1001 13:07:26.228921 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:26 crc kubenswrapper[4749]: I1001 13:07:26.228931 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:26 crc kubenswrapper[4749]: I1001 13:07:26.228921 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:26 crc kubenswrapper[4749]: E1001 13:07:26.229091 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:26 crc kubenswrapper[4749]: E1001 13:07:26.229400 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:26 crc kubenswrapper[4749]: E1001 13:07:26.229511 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:27 crc kubenswrapper[4749]: I1001 13:07:27.229558 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:27 crc kubenswrapper[4749]: E1001 13:07:27.229781 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:28 crc kubenswrapper[4749]: I1001 13:07:28.229503 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:28 crc kubenswrapper[4749]: E1001 13:07:28.229724 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:28 crc kubenswrapper[4749]: I1001 13:07:28.230107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:28 crc kubenswrapper[4749]: E1001 13:07:28.230278 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:28 crc kubenswrapper[4749]: I1001 13:07:28.231799 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:28 crc kubenswrapper[4749]: E1001 13:07:28.232056 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:29 crc kubenswrapper[4749]: I1001 13:07:29.229088 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:29 crc kubenswrapper[4749]: E1001 13:07:29.229335 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:30 crc kubenswrapper[4749]: I1001 13:07:30.228772 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:30 crc kubenswrapper[4749]: I1001 13:07:30.228849 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:30 crc kubenswrapper[4749]: E1001 13:07:30.229635 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:30 crc kubenswrapper[4749]: I1001 13:07:30.229091 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:30 crc kubenswrapper[4749]: E1001 13:07:30.229741 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:30 crc kubenswrapper[4749]: E1001 13:07:30.229945 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:31 crc kubenswrapper[4749]: I1001 13:07:31.229456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:31 crc kubenswrapper[4749]: E1001 13:07:31.232088 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:32 crc kubenswrapper[4749]: I1001 13:07:32.228915 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:07:32 crc kubenswrapper[4749]: E1001 13:07:32.229449 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 13:07:32 crc kubenswrapper[4749]: I1001 13:07:32.229107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:32 crc kubenswrapper[4749]: I1001 13:07:32.229023 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:07:32 crc kubenswrapper[4749]: E1001 13:07:32.230281 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 13:07:32 crc kubenswrapper[4749]: E1001 13:07:32.230494 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 13:07:32 crc kubenswrapper[4749]: I1001 13:07:32.230743 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:07:32 crc kubenswrapper[4749]: E1001 13:07:32.231027 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-fgjjp_openshift-ovn-kubernetes(f6a678bb-9e17-4b2a-bef9-dea34bc3c218)\"" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" Oct 01 13:07:33 crc kubenswrapper[4749]: I1001 13:07:33.229627 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:07:33 crc kubenswrapper[4749]: E1001 13:07:33.229856 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69" Oct 01 13:07:34 crc kubenswrapper[4749]: I1001 13:07:34.229711 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:07:34 crc kubenswrapper[4749]: I1001 13:07:34.229723 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:34 crc kubenswrapper[4749]: E1001 13:07:34.229969 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:34 crc kubenswrapper[4749]: I1001 13:07:34.229885 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:34 crc kubenswrapper[4749]: E1001 13:07:34.230114 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:34 crc kubenswrapper[4749]: E1001 13:07:34.230205 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:35 crc kubenswrapper[4749]: I1001 13:07:35.230078 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:35 crc kubenswrapper[4749]: E1001 13:07:35.230345 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:36 crc kubenswrapper[4749]: I1001 13:07:36.229006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:36 crc kubenswrapper[4749]: I1001 13:07:36.229200 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:36 crc kubenswrapper[4749]: E1001 13:07:36.229324 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:36 crc kubenswrapper[4749]: I1001 13:07:36.229423 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:36 crc kubenswrapper[4749]: E1001 13:07:36.229496 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:36 crc kubenswrapper[4749]: E1001 13:07:36.229631 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:37 crc kubenswrapper[4749]: I1001 13:07:37.229790 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:37 crc kubenswrapper[4749]: E1001 13:07:37.230026 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:38 crc kubenswrapper[4749]: I1001 13:07:38.229462 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:38 crc kubenswrapper[4749]: I1001 13:07:38.229548 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:38 crc kubenswrapper[4749]: E1001 13:07:38.229628 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:38 crc kubenswrapper[4749]: I1001 13:07:38.229556 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:38 crc kubenswrapper[4749]: E1001 13:07:38.229777 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:38 crc kubenswrapper[4749]: E1001 13:07:38.229980 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:39 crc kubenswrapper[4749]: I1001 13:07:39.004977 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/1.log"
Oct 01 13:07:39 crc kubenswrapper[4749]: I1001 13:07:39.005930 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/0.log"
Oct 01 13:07:39 crc kubenswrapper[4749]: I1001 13:07:39.006016 4749 generic.go:334] "Generic (PLEG): container finished" podID="33919a9e-1f0d-4127-915d-17d77d78853e" containerID="b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6" exitCode=1
Oct 01 13:07:39 crc kubenswrapper[4749]: I1001 13:07:39.006074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrgp7" event={"ID":"33919a9e-1f0d-4127-915d-17d77d78853e","Type":"ContainerDied","Data":"b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6"}
Oct 01 13:07:39 crc kubenswrapper[4749]: I1001 13:07:39.006249 4749 scope.go:117] "RemoveContainer" containerID="d3a6032f2d76514f6410b55dee878227e39bd64f86bcdd46595ff79371550233"
Oct 01 13:07:39 crc kubenswrapper[4749]: I1001 13:07:39.006881 4749 scope.go:117] "RemoveContainer" containerID="b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6"
Oct 01 13:07:39 crc kubenswrapper[4749]: E1001 13:07:39.007155 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-nrgp7_openshift-multus(33919a9e-1f0d-4127-915d-17d77d78853e)\"" pod="openshift-multus/multus-nrgp7" podUID="33919a9e-1f0d-4127-915d-17d77d78853e"
Oct 01 13:07:39 crc kubenswrapper[4749]: I1001 13:07:39.229610 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:39 crc kubenswrapper[4749]: E1001 13:07:39.229789 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:40 crc kubenswrapper[4749]: I1001 13:07:40.012189 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/1.log"
Oct 01 13:07:40 crc kubenswrapper[4749]: I1001 13:07:40.229200 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:40 crc kubenswrapper[4749]: I1001 13:07:40.229300 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:40 crc kubenswrapper[4749]: E1001 13:07:40.229448 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:40 crc kubenswrapper[4749]: I1001 13:07:40.229465 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:40 crc kubenswrapper[4749]: E1001 13:07:40.229527 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:40 crc kubenswrapper[4749]: E1001 13:07:40.229594 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:41 crc kubenswrapper[4749]: E1001 13:07:41.193163 4749 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Oct 01 13:07:41 crc kubenswrapper[4749]: I1001 13:07:41.229528 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:41 crc kubenswrapper[4749]: E1001 13:07:41.231293 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:41 crc kubenswrapper[4749]: E1001 13:07:41.374992 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 01 13:07:42 crc kubenswrapper[4749]: I1001 13:07:42.229473 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:42 crc kubenswrapper[4749]: I1001 13:07:42.229518 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:42 crc kubenswrapper[4749]: I1001 13:07:42.229578 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:42 crc kubenswrapper[4749]: E1001 13:07:42.229771 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:42 crc kubenswrapper[4749]: E1001 13:07:42.229920 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:42 crc kubenswrapper[4749]: E1001 13:07:42.230009 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:43 crc kubenswrapper[4749]: I1001 13:07:43.229652 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:43 crc kubenswrapper[4749]: E1001 13:07:43.229833 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:44 crc kubenswrapper[4749]: I1001 13:07:44.229534 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:44 crc kubenswrapper[4749]: I1001 13:07:44.229534 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:44 crc kubenswrapper[4749]: I1001 13:07:44.229663 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:44 crc kubenswrapper[4749]: E1001 13:07:44.229709 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:44 crc kubenswrapper[4749]: E1001 13:07:44.229918 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:44 crc kubenswrapper[4749]: E1001 13:07:44.230555 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:44 crc kubenswrapper[4749]: I1001 13:07:44.230876 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"
Oct 01 13:07:45 crc kubenswrapper[4749]: I1001 13:07:45.032685 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/3.log"
Oct 01 13:07:45 crc kubenswrapper[4749]: I1001 13:07:45.036925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerStarted","Data":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"}
Oct 01 13:07:45 crc kubenswrapper[4749]: I1001 13:07:45.037668 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp"
Oct 01 13:07:45 crc kubenswrapper[4749]: I1001 13:07:45.079764 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podStartSLOduration=101.079744818 podStartE2EDuration="1m41.079744818s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:07:45.076383471 +0000 UTC m=+125.130368420" watchObservedRunningTime="2025-10-01 13:07:45.079744818 +0000 UTC m=+125.133729757"
Oct 01 13:07:45 crc kubenswrapper[4749]: I1001 13:07:45.230156 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:45 crc kubenswrapper[4749]: E1001 13:07:45.230430 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:45 crc kubenswrapper[4749]: I1001 13:07:45.252116 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mwlpq"]
Oct 01 13:07:46 crc kubenswrapper[4749]: I1001 13:07:46.040435 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:46 crc kubenswrapper[4749]: E1001 13:07:46.040881 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:46 crc kubenswrapper[4749]: I1001 13:07:46.229574 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:46 crc kubenswrapper[4749]: I1001 13:07:46.229636 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:46 crc kubenswrapper[4749]: E1001 13:07:46.229774 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:46 crc kubenswrapper[4749]: E1001 13:07:46.229894 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:46 crc kubenswrapper[4749]: I1001 13:07:46.229988 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:46 crc kubenswrapper[4749]: E1001 13:07:46.230119 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:46 crc kubenswrapper[4749]: E1001 13:07:46.376639 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 01 13:07:48 crc kubenswrapper[4749]: I1001 13:07:48.228797 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:48 crc kubenswrapper[4749]: I1001 13:07:48.228864 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:48 crc kubenswrapper[4749]: E1001 13:07:48.229329 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:48 crc kubenswrapper[4749]: I1001 13:07:48.228982 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:48 crc kubenswrapper[4749]: E1001 13:07:48.229479 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:48 crc kubenswrapper[4749]: I1001 13:07:48.228918 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:48 crc kubenswrapper[4749]: E1001 13:07:48.229622 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:48 crc kubenswrapper[4749]: E1001 13:07:48.229715 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:50 crc kubenswrapper[4749]: I1001 13:07:50.228977 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:50 crc kubenswrapper[4749]: I1001 13:07:50.229054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:50 crc kubenswrapper[4749]: I1001 13:07:50.229095 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:50 crc kubenswrapper[4749]: I1001 13:07:50.228989 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:50 crc kubenswrapper[4749]: E1001 13:07:50.229257 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:50 crc kubenswrapper[4749]: E1001 13:07:50.229400 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:50 crc kubenswrapper[4749]: E1001 13:07:50.229552 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:50 crc kubenswrapper[4749]: E1001 13:07:50.229721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:51 crc kubenswrapper[4749]: I1001 13:07:51.231849 4749 scope.go:117] "RemoveContainer" containerID="b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6"
Oct 01 13:07:51 crc kubenswrapper[4749]: E1001 13:07:51.377387 4749 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 01 13:07:52 crc kubenswrapper[4749]: I1001 13:07:52.065685 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/1.log"
Oct 01 13:07:52 crc kubenswrapper[4749]: I1001 13:07:52.065800 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrgp7" event={"ID":"33919a9e-1f0d-4127-915d-17d77d78853e","Type":"ContainerStarted","Data":"dafe8b3793369956dfb95caebe1bb42e3642c1ecebf130cd340a4642e7927aa7"}
Oct 01 13:07:52 crc kubenswrapper[4749]: I1001 13:07:52.229884 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:52 crc kubenswrapper[4749]: I1001 13:07:52.229921 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:52 crc kubenswrapper[4749]: I1001 13:07:52.229892 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:52 crc kubenswrapper[4749]: I1001 13:07:52.229884 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:52 crc kubenswrapper[4749]: E1001 13:07:52.230019 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:52 crc kubenswrapper[4749]: E1001 13:07:52.230116 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:52 crc kubenswrapper[4749]: E1001 13:07:52.230276 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:52 crc kubenswrapper[4749]: E1001 13:07:52.230328 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:54 crc kubenswrapper[4749]: I1001 13:07:54.229686 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:54 crc kubenswrapper[4749]: E1001 13:07:54.229867 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:54 crc kubenswrapper[4749]: I1001 13:07:54.230147 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:54 crc kubenswrapper[4749]: E1001 13:07:54.230266 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:54 crc kubenswrapper[4749]: I1001 13:07:54.230548 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:54 crc kubenswrapper[4749]: E1001 13:07:54.230747 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:54 crc kubenswrapper[4749]: I1001 13:07:54.230547 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:54 crc kubenswrapper[4749]: E1001 13:07:54.230954 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:56 crc kubenswrapper[4749]: I1001 13:07:56.229520 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:56 crc kubenswrapper[4749]: I1001 13:07:56.229617 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:56 crc kubenswrapper[4749]: I1001 13:07:56.229618 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:56 crc kubenswrapper[4749]: E1001 13:07:56.229759 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mwlpq" podUID="27497171-a8cc-4282-8ee6-2f68f768fc69"
Oct 01 13:07:56 crc kubenswrapper[4749]: E1001 13:07:56.229864 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 13:07:56 crc kubenswrapper[4749]: I1001 13:07:56.229941 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:56 crc kubenswrapper[4749]: E1001 13:07:56.230069 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 13:07:56 crc kubenswrapper[4749]: E1001 13:07:56.230158 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.229076 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.229107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.229173 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.229335 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.232361 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.232606 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.232724 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.233455 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.233498 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Oct 01 13:07:58 crc kubenswrapper[4749]: I1001 13:07:58.235055 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Oct 01 13:07:59 crc kubenswrapper[4749]: I1001 13:07:59.562783 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.013145 4749 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.064893 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w2nbb"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.065469 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.067893 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.068814 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.070865 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.071077 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.071097 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.071752 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.072095 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.072611 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xk4rn"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.073442 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.074136 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.074804 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.077801 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.078192 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.078869 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.087666 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.088162 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.088751 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.089199 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.089488 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.089728 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.090029 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.090055 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.090670 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.090816 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.092794 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.090741 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 01 13:08:04 crc 
kubenswrapper[4749]: I1001 13:08:04.098317 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2j6m9"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.098767 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2j6m9" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.101842 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.102804 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.107442 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.107673 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.107950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.108173 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.108439 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.108625 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.108841 4749 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.110586 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.114651 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.115935 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4zj5j"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.134140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.135868 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x9q82"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.147832 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.151929 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.152658 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h699h"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.152674 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.152953 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw"] Oct 01 13:08:04 crc 
kubenswrapper[4749]: I1001 13:08:04.153117 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.153213 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.153436 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.153581 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.153610 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.153667 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.153781 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.153872 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.153981 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.154069 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.154323 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h699h" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.154308 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-d2qqz"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.154778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.154890 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.154979 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.155002 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.155207 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.155659 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.156081 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.157774 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.160907 4749 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.162071 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.162295 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.162490 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.162575 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.162591 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.162806 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.163483 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.164325 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.164606 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.165006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.167487 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ql7n8"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.168160 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.168408 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.171260 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.172072 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.172146 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.172246 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.172379 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.172283 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.172689 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.172761 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.176045 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.176842 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.176976 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqndp"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.177454 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179102 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179348 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179512 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179651 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179724 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179796 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179912 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179953 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.180027 4749 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.180113 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.180187 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.193296 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.193637 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.194033 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.194594 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.194993 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.195432 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.196116 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.179920 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.198401 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.200410 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.200919 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.203509 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.204168 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227268 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227280 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227449 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227461 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6smx5"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227525 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227692 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 01 13:08:04 crc kubenswrapper[4749]: 
I1001 13:08:04.227773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dcec683-7e35-4aea-95da-b43baaf89e03-metrics-tls\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227798 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227856 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-serving-cert\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd4h4\" (UniqueName: \"kubernetes.io/projected/dc3a6c0f-3591-44b3-9044-b219d6a69787-kube-api-access-qd4h4\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc 
kubenswrapper[4749]: I1001 13:08:04.227891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.227976 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnk9n\" (UniqueName: \"kubernetes.io/projected/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-kube-api-access-dnk9n\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rvvx\" (UniqueName: \"kubernetes.io/projected/43aae4a1-9504-4c81-9ff1-675c0b51ced2-kube-api-access-4rvvx\") pod \"downloads-7954f5f757-2j6m9\" (UID: \"43aae4a1-9504-4c81-9ff1-675c0b51ced2\") " pod="openshift-console/downloads-7954f5f757-2j6m9" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228026 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228043 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc3a6c0f-3591-44b3-9044-b219d6a69787-config\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dcec683-7e35-4aea-95da-b43baaf89e03-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbf033db-e8ba-43f1-bf88-287c3afc79e7-serving-cert\") pod \"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228171 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2xc8\" (UniqueName: \"kubernetes.io/projected/5dcec683-7e35-4aea-95da-b43baaf89e03-kube-api-access-b2xc8\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228204 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228236 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228255 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228266 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228316 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228406 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228437 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228470 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228553 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf033db-e8ba-43f1-bf88-287c3afc79e7-config\") pod \"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc 
kubenswrapper[4749]: I1001 13:08:04.228599 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-config\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228656 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228660 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg5q9\" (UniqueName: \"kubernetes.io/projected/a4a7417c-f1ec-4b88-9f86-8a4da4365429-kube-api-access-lg5q9\") pod \"cluster-samples-operator-665b6dd947-h64lc\" (UID: \"a4a7417c-f1ec-4b88-9f86-8a4da4365429\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228695 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-dir\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zvdk\" (UniqueName: \"kubernetes.io/projected/dbf033db-e8ba-43f1-bf88-287c3afc79e7-kube-api-access-5zvdk\") pod \"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v6xk\" (UniqueName: \"kubernetes.io/projected/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-kube-api-access-5v6xk\") pod \"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-service-ca-bundle\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 
13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228774 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228814 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228786 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228925 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228937 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwjwk\" (UniqueName: \"kubernetes.io/projected/1e490c27-453c-4b8b-8a27-f446aee2178b-kube-api-access-xwjwk\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.228997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-policies\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5dcec683-7e35-4aea-95da-b43baaf89e03-trusted-ca\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229032 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4a7417c-f1ec-4b88-9f86-8a4da4365429-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h64lc\" (UID: \"a4a7417c-f1ec-4b88-9f86-8a4da4365429\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbf033db-e8ba-43f1-bf88-287c3afc79e7-trusted-ca\") pod \"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229089 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dc3a6c0f-3591-44b3-9044-b219d6a69787-machine-approver-tls\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc3a6c0f-3591-44b3-9044-b219d6a69787-auth-proxy-config\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229305 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.229330 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.230182 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.233745 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.233975 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.234713 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.238200 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.239037 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.239237 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.239750 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.240749 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.242301 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.243438 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.246554 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.246584 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w2nbb"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.246595 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.246952 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.247182 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.247750 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.248266 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.249638 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nwwtl"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.250254 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.251367 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.252854 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.253318 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.254747 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nsv4j"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.255200 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.257502 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.257929 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.261911 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.262278 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.262779 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.263115 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.264356 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gb6ks"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.265166 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.266958 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nxdhr"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.267500 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.268086 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.269013 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4kx2"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.269112 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.269399 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.269575 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.269695 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.270646 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xk4rn"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.274327 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.275129 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.276585 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.278035 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.279826 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.280368 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2j6m9"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.285205 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4kcnk"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.288130 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.288185 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.295725 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x9q82"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.296759 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-g7kvk"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.302947 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4zj5j"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.303040 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.304108 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h699h"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.305384 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.306767 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ql7n8"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.307670 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.308189 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.309370 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2"] Oct 01 13:08:04 crc kubenswrapper[4749]: 
I1001 13:08:04.311372 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4kx2"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.312128 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6smx5"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.314903 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.314924 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.315698 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.317838 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nwwtl"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.319424 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.320908 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nxdhr"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.322347 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.323817 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.324853 4749 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqndp"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.325820 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gsm5p"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.326463 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gsm5p" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.327184 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.327446 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.328412 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329710 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6prp\" (UniqueName: \"kubernetes.io/projected/79c4368a-f145-4cfe-96ed-c1076533fa3a-kube-api-access-f6prp\") pod \"package-server-manager-789f6589d5-tlpbw\" (UID: \"79c4368a-f145-4cfe-96ed-c1076533fa3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329741 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rvvx\" (UniqueName: \"kubernetes.io/projected/43aae4a1-9504-4c81-9ff1-675c0b51ced2-kube-api-access-4rvvx\") pod \"downloads-7954f5f757-2j6m9\" (UID: \"43aae4a1-9504-4c81-9ff1-675c0b51ced2\") " pod="openshift-console/downloads-7954f5f757-2j6m9" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 
13:08:04.329761 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329779 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329803 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc3a6c0f-3591-44b3-9044-b219d6a69787-config\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329823 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c57f71-dac4-492c-b4f9-884233468771-config\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbf033db-e8ba-43f1-bf88-287c3afc79e7-serving-cert\") pod 
\"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329854 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dcec683-7e35-4aea-95da-b43baaf89e03-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280cdbe4-3133-4c44-a483-0be2a86f2f36-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329892 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2xc8\" (UniqueName: \"kubernetes.io/projected/5dcec683-7e35-4aea-95da-b43baaf89e03-kube-api-access-b2xc8\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329929 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.329996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf033db-e8ba-43f1-bf88-287c3afc79e7-config\") pod \"console-operator-58897d9998-xk4rn\" (UID: 
\"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g2ww\" (UniqueName: \"kubernetes.io/projected/1980f6f8-707b-4dd0-955e-156b0e1598e3-kube-api-access-5g2ww\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1980f6f8-707b-4dd0-955e-156b0e1598e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330065 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/79c4368a-f145-4cfe-96ed-c1076533fa3a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tlpbw\" (UID: \"79c4368a-f145-4cfe-96ed-c1076533fa3a\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330082 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-config\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c560bc26-ad30-401e-823c-66e9b1a999a1-serving-cert\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330114 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-dir\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330132 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg5q9\" (UniqueName: \"kubernetes.io/projected/a4a7417c-f1ec-4b88-9f86-8a4da4365429-kube-api-access-lg5q9\") pod \"cluster-samples-operator-665b6dd947-h64lc\" (UID: \"a4a7417c-f1ec-4b88-9f86-8a4da4365429\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330148 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vjv\" (UniqueName: \"kubernetes.io/projected/7daba5cb-4db9-41c4-b8a7-3a865bb69776-kube-api-access-m5vjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330179 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280cdbe4-3133-4c44-a483-0be2a86f2f36-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v6xk\" (UniqueName: \"kubernetes.io/projected/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-kube-api-access-5v6xk\") pod \"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330219 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zvdk\" (UniqueName: \"kubernetes.io/projected/dbf033db-e8ba-43f1-bf88-287c3afc79e7-kube-api-access-5zvdk\") pod 
\"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-service-ca-bundle\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1980f6f8-707b-4dd0-955e-156b0e1598e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330292 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwjwk\" (UniqueName: \"kubernetes.io/projected/1e490c27-453c-4b8b-8a27-f446aee2178b-kube-api-access-xwjwk\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dcec683-7e35-4aea-95da-b43baaf89e03-trusted-ca\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330323 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-policies\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330345 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4a7417c-f1ec-4b88-9f86-8a4da4365429-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h64lc\" (UID: \"a4a7417c-f1ec-4b88-9f86-8a4da4365429\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbf033db-e8ba-43f1-bf88-287c3afc79e7-trusted-ca\") pod \"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330393 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dc3a6c0f-3591-44b3-9044-b219d6a69787-machine-approver-tls\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daba5cb-4db9-41c4-b8a7-3a865bb69776-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc3a6c0f-3591-44b3-9044-b219d6a69787-auth-proxy-config\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqnxk\" (UniqueName: \"kubernetes.io/projected/c560bc26-ad30-401e-823c-66e9b1a999a1-kube-api-access-rqnxk\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330478 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4368a25-e1cf-4ea8-89d3-18d75848a783-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8c57f71-dac4-492c-b4f9-884233468771-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daba5cb-4db9-41c4-b8a7-3a865bb69776-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330530 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/5dcec683-7e35-4aea-95da-b43baaf89e03-metrics-tls\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8c57f71-dac4-492c-b4f9-884233468771-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330611 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-serving-cert\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330626 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd4h4\" (UniqueName: \"kubernetes.io/projected/dc3a6c0f-3591-44b3-9044-b219d6a69787-kube-api-access-qd4h4\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1980f6f8-707b-4dd0-955e-156b0e1598e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330676 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c560bc26-ad30-401e-823c-66e9b1a999a1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnk9n\" (UniqueName: \"kubernetes.io/projected/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-kube-api-access-dnk9n\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330712 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4368a25-e1cf-4ea8-89d3-18d75848a783-config\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4368a25-e1cf-4ea8-89d3-18d75848a783-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.330743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/280cdbe4-3133-4c44-a483-0be2a86f2f36-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.332501 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.332602 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dc3a6c0f-3591-44b3-9044-b219d6a69787-config\") pod 
\"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.333066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dcec683-7e35-4aea-95da-b43baaf89e03-trusted-ca\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.333202 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbf033db-e8ba-43f1-bf88-287c3afc79e7-config\") pod \"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.333241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.333289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dbf033db-e8ba-43f1-bf88-287c3afc79e7-trusted-ca\") pod \"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.333775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-service-ca-bundle\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.333859 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-policies\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.333966 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.334139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dc3a6c0f-3591-44b3-9044-b219d6a69787-auth-proxy-config\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.335634 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.335794 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.335837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.335911 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-dir\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.335999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.336211 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nrjl4"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.336789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-config\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.337127 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.337586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.337629 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.337788 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.338291 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dc3a6c0f-3591-44b3-9044-b219d6a69787-machine-approver-tls\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.338413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbf033db-e8ba-43f1-bf88-287c3afc79e7-serving-cert\") pod 
\"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.338483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a4a7417c-f1ec-4b88-9f86-8a4da4365429-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h64lc\" (UID: \"a4a7417c-f1ec-4b88-9f86-8a4da4365429\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.338715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.338816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5dcec683-7e35-4aea-95da-b43baaf89e03-metrics-tls\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.339011 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.339876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: 
\"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.340034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-serving-cert\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.340388 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.341357 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.341722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.341726 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.342895 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9"] Oct 01 13:08:04 
crc kubenswrapper[4749]: I1001 13:08:04.344305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.344345 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gsm5p"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.345540 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gb6ks"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.347476 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nsv4j"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.347709 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.349105 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.350319 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nrjl4"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.351326 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4kcnk"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.351329 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: 
\"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.352263 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh"] Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.373449 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.387798 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.407936 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.427873 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daba5cb-4db9-41c4-b8a7-3a865bb69776-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431214 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqnxk\" (UniqueName: \"kubernetes.io/projected/c560bc26-ad30-401e-823c-66e9b1a999a1-kube-api-access-rqnxk\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: 
I1001 13:08:04.431248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4368a25-e1cf-4ea8-89d3-18d75848a783-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431266 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8c57f71-dac4-492c-b4f9-884233468771-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7daba5cb-4db9-41c4-b8a7-3a865bb69776-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1980f6f8-707b-4dd0-955e-156b0e1598e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a8c57f71-dac4-492c-b4f9-884233468771-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c560bc26-ad30-401e-823c-66e9b1a999a1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431365 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4368a25-e1cf-4ea8-89d3-18d75848a783-config\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4368a25-e1cf-4ea8-89d3-18d75848a783-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431394 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/280cdbe4-3133-4c44-a483-0be2a86f2f36-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431416 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6prp\" (UniqueName: \"kubernetes.io/projected/79c4368a-f145-4cfe-96ed-c1076533fa3a-kube-api-access-f6prp\") pod \"package-server-manager-789f6589d5-tlpbw\" (UID: \"79c4368a-f145-4cfe-96ed-c1076533fa3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c57f71-dac4-492c-b4f9-884233468771-config\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280cdbe4-3133-4c44-a483-0be2a86f2f36-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g2ww\" (UniqueName: \"kubernetes.io/projected/1980f6f8-707b-4dd0-955e-156b0e1598e3-kube-api-access-5g2ww\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431515 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1980f6f8-707b-4dd0-955e-156b0e1598e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/79c4368a-f145-4cfe-96ed-c1076533fa3a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tlpbw\" (UID: \"79c4368a-f145-4cfe-96ed-c1076533fa3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431548 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c560bc26-ad30-401e-823c-66e9b1a999a1-serving-cert\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vjv\" (UniqueName: \"kubernetes.io/projected/7daba5cb-4db9-41c4-b8a7-3a865bb69776-kube-api-access-m5vjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280cdbe4-3133-4c44-a483-0be2a86f2f36-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.431615 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1980f6f8-707b-4dd0-955e-156b0e1598e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.433024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4368a25-e1cf-4ea8-89d3-18d75848a783-config\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.433056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c560bc26-ad30-401e-823c-66e9b1a999a1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.433471 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1980f6f8-707b-4dd0-955e-156b0e1598e3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.435832 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/79c4368a-f145-4cfe-96ed-c1076533fa3a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tlpbw\" (UID: \"79c4368a-f145-4cfe-96ed-c1076533fa3a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.448772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4368a25-e1cf-4ea8-89d3-18d75848a783-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.455137 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.467687 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.488249 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.507463 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.548574 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.556547 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c560bc26-ad30-401e-823c-66e9b1a999a1-serving-cert\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.568358 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.589077 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.608557 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.628553 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.649062 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.668935 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.677187 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/280cdbe4-3133-4c44-a483-0be2a86f2f36-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.688286 4749 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.694035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280cdbe4-3133-4c44-a483-0be2a86f2f36-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.708441 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.729396 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.748817 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.756200 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7daba5cb-4db9-41c4-b8a7-3a865bb69776-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.768762 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.774157 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7daba5cb-4db9-41c4-b8a7-3a865bb69776-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.789042 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.807948 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.829071 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.849114 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.869164 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.888106 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.908651 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.928588 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 01 13:08:04 
crc kubenswrapper[4749]: I1001 13:08:04.948254 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.957264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8c57f71-dac4-492c-b4f9-884233468771-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.968396 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.974071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8c57f71-dac4-492c-b4f9-884233468771-config\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.989482 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 13:08:04 crc kubenswrapper[4749]: I1001 13:08:04.996946 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1980f6f8-707b-4dd0-955e-156b0e1598e3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:05 crc kubenswrapper[4749]: 
I1001 13:08:05.007646 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.049344 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.068008 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.088417 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.108708 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.128471 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.148871 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.168481 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.187722 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.208090 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.228876 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.248373 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.266544 4749 request.go:700] Waited for 1.011129233s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/secrets?fieldSelector=metadata.name%3Dconsole-dockercfg-f62pw&limit=500&resourceVersion=0 Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.269648 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.289175 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.308935 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.328139 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.359508 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.368806 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.388073 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.408971 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 
13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.429095 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.448938 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.468506 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.488206 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.508593 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.528705 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.547977 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.568322 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.587520 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.608495 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.629490 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.649559 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.668884 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.688859 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.708643 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.728109 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.748582 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.773436 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.797344 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.808851 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.828470 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 
13:08:05.849069 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.868190 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.888007 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.908420 4749 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.928038 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.948017 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.968938 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 01 13:08:05 crc kubenswrapper[4749]: I1001 13:08:05.988500 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.008330 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.028864 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.048879 4749 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.068477 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.088426 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.138128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rvvx\" (UniqueName: \"kubernetes.io/projected/43aae4a1-9504-4c81-9ff1-675c0b51ced2-kube-api-access-4rvvx\") pod \"downloads-7954f5f757-2j6m9\" (UID: \"43aae4a1-9504-4c81-9ff1-675c0b51ced2\") " pod="openshift-console/downloads-7954f5f757-2j6m9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.158499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg5q9\" (UniqueName: \"kubernetes.io/projected/a4a7417c-f1ec-4b88-9f86-8a4da4365429-kube-api-access-lg5q9\") pod \"cluster-samples-operator-665b6dd947-h64lc\" (UID: \"a4a7417c-f1ec-4b88-9f86-8a4da4365429\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.181675 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v6xk\" (UniqueName: \"kubernetes.io/projected/698ca3be-58a9-4c4a-88ea-f0f89856cbe4-kube-api-access-5v6xk\") pod \"openshift-apiserver-operator-796bbdcf4f-5mk4q\" (UID: \"698ca3be-58a9-4c4a-88ea-f0f89856cbe4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.214260 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zvdk\" (UniqueName: \"kubernetes.io/projected/dbf033db-e8ba-43f1-bf88-287c3afc79e7-kube-api-access-5zvdk\") pod 
\"console-operator-58897d9998-xk4rn\" (UID: \"dbf033db-e8ba-43f1-bf88-287c3afc79e7\") " pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.217458 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dcec683-7e35-4aea-95da-b43baaf89e03-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.235142 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.244204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwjwk\" (UniqueName: \"kubernetes.io/projected/1e490c27-453c-4b8b-8a27-f446aee2178b-kube-api-access-xwjwk\") pod \"oauth-openshift-558db77b4-4zj5j\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.248281 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.253077 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.268538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnk9n\" (UniqueName: \"kubernetes.io/projected/9346ab8a-f5d2-4d33-be7f-4b7fe5687044-kube-api-access-dnk9n\") pod \"authentication-operator-69f744f599-w2nbb\" (UID: \"9346ab8a-f5d2-4d33-be7f-4b7fe5687044\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.269011 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.286687 4749 request.go:700] Waited for 1.949202203s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/serviceaccounts/ingress-operator/token Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.298369 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2j6m9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.308880 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.315921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2xc8\" (UniqueName: \"kubernetes.io/projected/5dcec683-7e35-4aea-95da-b43baaf89e03-kube-api-access-b2xc8\") pod \"ingress-operator-5b745b69d9-7bfdt\" (UID: \"5dcec683-7e35-4aea-95da-b43baaf89e03\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.340812 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.360203 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd4h4\" (UniqueName: \"kubernetes.io/projected/dc3a6c0f-3591-44b3-9044-b219d6a69787-kube-api-access-qd4h4\") pod \"machine-approver-56656f9798-ztkh5\" (UID: \"dc3a6c0f-3591-44b3-9044-b219d6a69787\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.361637 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.373063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1980f6f8-707b-4dd0-955e-156b0e1598e3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.390992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqnxk\" (UniqueName: \"kubernetes.io/projected/c560bc26-ad30-401e-823c-66e9b1a999a1-kube-api-access-rqnxk\") pod \"openshift-config-operator-7777fb866f-2hdb5\" (UID: \"c560bc26-ad30-401e-823c-66e9b1a999a1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.420421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6prp\" (UniqueName: \"kubernetes.io/projected/79c4368a-f145-4cfe-96ed-c1076533fa3a-kube-api-access-f6prp\") pod \"package-server-manager-789f6589d5-tlpbw\" (UID: \"79c4368a-f145-4cfe-96ed-c1076533fa3a\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.426837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8c57f71-dac4-492c-b4f9-884233468771-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mb8kf\" (UID: \"a8c57f71-dac4-492c-b4f9-884233468771\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.446199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/280cdbe4-3133-4c44-a483-0be2a86f2f36-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-kf4bh\" (UID: \"280cdbe4-3133-4c44-a483-0be2a86f2f36\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.466746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a4368a25-e1cf-4ea8-89d3-18d75848a783-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-d2pbw\" (UID: \"a4368a25-e1cf-4ea8-89d3-18d75848a783\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.473475 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.482939 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.485685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g2ww\" (UniqueName: \"kubernetes.io/projected/1980f6f8-707b-4dd0-955e-156b0e1598e3-kube-api-access-5g2ww\") pod \"cluster-image-registry-operator-dc59b4c8b-ntsgr\" (UID: \"1980f6f8-707b-4dd0-955e-156b0e1598e3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.498479 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.503905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vjv\" (UniqueName: \"kubernetes.io/projected/7daba5cb-4db9-41c4-b8a7-3a865bb69776-kube-api-access-m5vjv\") pod \"kube-storage-version-migrator-operator-b67b599dd-8lpdc\" (UID: \"7daba5cb-4db9-41c4-b8a7-3a865bb69776\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.504727 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.506687 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.557059 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q"] Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.563901 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ba15709-180a-4045-9d19-df6de2d8cf6e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564562 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqqtd\" (UniqueName: \"kubernetes.io/projected/e7b739c8-2b6d-47da-b5a6-f40a3a1ee298-kube-api-access-qqqtd\") pod \"migrator-59844c95c7-2sc4v\" (UID: \"e7b739c8-2b6d-47da-b5a6-f40a3a1ee298\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564602 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-bound-sa-token\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89ddaa81-0040-4e7c-8207-dc77ca8b888a-metrics-tls\") pod \"dns-operator-744455d44c-h699h\" (UID: \"89ddaa81-0040-4e7c-8207-dc77ca8b888a\") " pod="openshift-dns-operator/dns-operator-744455d44c-h699h" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564633 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhz8z\" (UniqueName: \"kubernetes.io/projected/744adf1a-749a-4816-926e-540e6f80acc0-kube-api-access-qhz8z\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: \"744adf1a-749a-4816-926e-540e6f80acc0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97489cec-2ad5-4b5c-89fd-d51ab641c126-service-ca-bundle\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7t5q\" (UniqueName: \"kubernetes.io/projected/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-kube-api-access-x7t5q\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564696 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-serving-cert\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 
13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564768 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-stats-auth\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-client-ca\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-audit\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564813 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564827 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-service-ca\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564841 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564869 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/744adf1a-749a-4816-926e-540e6f80acc0-srv-cert\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: \"744adf1a-749a-4816-926e-540e6f80acc0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4cw7\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-kube-api-access-w4cw7\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564907 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-client\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.564977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cp9j\" (UniqueName: \"kubernetes.io/projected/0499d12a-7da9-4ece-9b62-87b0141c91f1-kube-api-access-7cp9j\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-config\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565018 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ba15709-180a-4045-9d19-df6de2d8cf6e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565058 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgfvx\" (UniqueName: \"kubernetes.io/projected/1d30b63f-8173-4be0-a523-f69f604cb48a-kube-api-access-kgfvx\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565074 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-certificates\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565088 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-etcd-client\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-serving-cert\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-config\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d30b63f-8173-4be0-a523-f69f604cb48a-audit-dir\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565167 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxlfg\" (UniqueName: \"kubernetes.io/projected/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-kube-api-access-wxlfg\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-audit-policies\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frf2s\" (UniqueName: \"kubernetes.io/projected/89ddaa81-0040-4e7c-8207-dc77ca8b888a-kube-api-access-frf2s\") pod \"dns-operator-744455d44c-h699h\" (UID: \"89ddaa81-0040-4e7c-8207-dc77ca8b888a\") " pod="openshift-dns-operator/dns-operator-744455d44c-h699h" Oct 01 13:08:06 
crc kubenswrapper[4749]: I1001 13:08:06.565284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-default-certificate\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-tls\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-encryption-config\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565353 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbdc3512-8dd4-4298-b49b-63a7fc87040c-serving-cert\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565377 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9zm\" (UniqueName: \"kubernetes.io/projected/31175322-99a0-4224-82d9-ca63e5a241c8-kube-api-access-8l9zm\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-8twps\" (UID: \"31175322-99a0-4224-82d9-ca63e5a241c8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565392 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-trusted-ca\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0499d12a-7da9-4ece-9b62-87b0141c91f1-audit-dir\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-etcd-client\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-metrics-certs\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565471 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-ca\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29wx\" (UniqueName: \"kubernetes.io/projected/fbdc3512-8dd4-4298-b49b-63a7fc87040c-kube-api-access-j29wx\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565499 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d30b63f-8173-4be0-a523-f69f604cb48a-node-pullsecrets\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-serving-cert\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 
13:08:06.565546 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-etcd-serving-ca\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-encryption-config\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-config\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w627z\" (UniqueName: \"kubernetes.io/projected/97489cec-2ad5-4b5c-89fd-d51ab641c126-kube-api-access-w627z\") pod 
\"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565648 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/744adf1a-749a-4816-926e-540e6f80acc0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: \"744adf1a-749a-4816-926e-540e6f80acc0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565646 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xk4rn"] Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-image-import-ca\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.565714 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/31175322-99a0-4224-82d9-ca63e5a241c8-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-8twps\" (UID: \"31175322-99a0-4224-82d9-ca63e5a241c8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" Oct 01 13:08:06 crc kubenswrapper[4749]: E1001 13:08:06.566955 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.066942843 +0000 UTC m=+147.120927742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.647015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc"] Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.652020 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666563 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-oauth-config\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666843 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-oauth-serving-cert\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666870 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-service-ca\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666896 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-trusted-ca-bundle\") pod 
\"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64nc8\" (UniqueName: \"kubernetes.io/projected/2c421ec0-cb43-4116-87a5-9621322f8a33-kube-api-access-64nc8\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4cw7\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-kube-api-access-w4cw7\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-client\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666980 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/744adf1a-749a-4816-926e-540e6f80acc0-srv-cert\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: \"744adf1a-749a-4816-926e-540e6f80acc0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.666998 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d5321a24-d271-46cb-9d0a-fde8089a6ddc-config-volume\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78eab148-c7db-4caf-99dd-7576fdee2366-config\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667106 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjn2j\" (UniqueName: \"kubernetes.io/projected/c4026781-41c8-4fbc-8350-a1ed556054b0-kube-api-access-mjn2j\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667128 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-service-ca\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667168 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-certificates\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87afcd70-2daf-4b05-a299-982f75f0a5fa-cert\") pod \"ingress-canary-gsm5p\" (UID: \"87afcd70-2daf-4b05-a299-982f75f0a5fa\") " pod="openshift-ingress-canary/ingress-canary-gsm5p" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lghp4\" (UniqueName: \"kubernetes.io/projected/d5321a24-d271-46cb-9d0a-fde8089a6ddc-kube-api-access-lghp4\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-serving-cert\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d30b63f-8173-4be0-a523-f69f604cb48a-audit-dir\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc 
kubenswrapper[4749]: I1001 13:08:06.667312 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxlfg\" (UniqueName: \"kubernetes.io/projected/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-kube-api-access-wxlfg\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667333 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-csi-data-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667358 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqrc\" (UniqueName: \"kubernetes.io/projected/e4084a94-565b-4599-b786-e1c0ed986450-kube-api-access-sqqrc\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667383 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-audit-policies\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667408 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-config\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-default-certificate\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667452 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-client-ca\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-tls\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgmhs\" (UniqueName: \"kubernetes.io/projected/13fabc72-2ab9-42f6-8e89-7868e3dd0228-kube-api-access-tgmhs\") pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 
13:08:06.667517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6d282991-125e-45a4-b6d7-65c5919cfbfa-images\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667551 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-trusted-ca\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d282991-125e-45a4-b6d7-65c5919cfbfa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-serving-cert\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d30b63f-8173-4be0-a523-f69f604cb48a-node-pullsecrets\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " 
pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpwn2\" (UniqueName: \"kubernetes.io/projected/30a10511-6a81-4150-9ae3-976a8062accc-kube-api-access-hpwn2\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-certs\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-apiservice-cert\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667764 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w627z\" (UniqueName: \"kubernetes.io/projected/97489cec-2ad5-4b5c-89fd-d51ab641c126-kube-api-access-w627z\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667785 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a10511-6a81-4150-9ae3-976a8062accc-serving-cert\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-registration-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-plugins-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667902 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7332bc1-2523-40a2-9a9c-494940d63f5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gb6ks\" (UID: \"e7332bc1-2523-40a2-9a9c-494940d63f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-serving-cert\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.667978 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-node-bootstrap-token\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhz8z\" (UniqueName: \"kubernetes.io/projected/744adf1a-749a-4816-926e-540e6f80acc0-kube-api-access-qhz8z\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: \"744adf1a-749a-4816-926e-540e6f80acc0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/87f666c4-4000-49c8-aadb-027878d0833f-serving-cert\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-serving-cert\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-stats-auth\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-audit\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-webhook-cert\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668165 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78eab148-c7db-4caf-99dd-7576fdee2366-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668185 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-config\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4026781-41c8-4fbc-8350-a1ed556054b0-config-volume\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668308 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cp9j\" (UniqueName: \"kubernetes.io/projected/0499d12a-7da9-4ece-9b62-87b0141c91f1-kube-api-access-7cp9j\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668341 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ba15709-180a-4045-9d19-df6de2d8cf6e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668360 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-config\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnvft\" (UniqueName: \"kubernetes.io/projected/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-kube-api-access-pnvft\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-tmpfs\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgfvx\" (UniqueName: \"kubernetes.io/projected/1d30b63f-8173-4be0-a523-f69f604cb48a-kube-api-access-kgfvx\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668457 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-etcd-client\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc 
kubenswrapper[4749]: I1001 13:08:06.668478 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e4084a94-565b-4599-b786-e1c0ed986450-proxy-tls\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdv2b\" (UniqueName: \"kubernetes.io/projected/e7332bc1-2523-40a2-9a9c-494940d63f5b-kube-api-access-mdv2b\") pod \"multus-admission-controller-857f4d67dd-gb6ks\" (UID: \"e7332bc1-2523-40a2-9a9c-494940d63f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d282991-125e-45a4-b6d7-65c5919cfbfa-proxy-tls\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-config\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13fabc72-2ab9-42f6-8e89-7868e3dd0228-signing-cabundle\") 
pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668578 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frf2s\" (UniqueName: \"kubernetes.io/projected/89ddaa81-0040-4e7c-8207-dc77ca8b888a-kube-api-access-frf2s\") pod \"dns-operator-744455d44c-h699h\" (UID: \"89ddaa81-0040-4e7c-8207-dc77ca8b888a\") " pod="openshift-dns-operator/dns-operator-744455d44c-h699h" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f666c4-4000-49c8-aadb-027878d0833f-config\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-socket-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668657 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-encryption-config\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fbdc3512-8dd4-4298-b49b-63a7fc87040c-serving-cert\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668710 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-profile-collector-cert\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9zm\" (UniqueName: \"kubernetes.io/projected/31175322-99a0-4224-82d9-ca63e5a241c8-kube-api-access-8l9zm\") pod \"control-plane-machine-set-operator-78cbb6b69f-8twps\" (UID: \"31175322-99a0-4224-82d9-ca63e5a241c8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0499d12a-7da9-4ece-9b62-87b0141c91f1-audit-dir\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668793 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-etcd-client\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668815 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4026781-41c8-4fbc-8350-a1ed556054b0-metrics-tls\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-metrics-certs\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668858 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdnh7\" (UniqueName: \"kubernetes.io/projected/85af42c2-35fc-4545-8fb2-22ab8beb3e22-kube-api-access-wdnh7\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcvmb\" (UniqueName: \"kubernetes.io/projected/6d282991-125e-45a4-b6d7-65c5919cfbfa-kube-api-access-qcvmb\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-ca\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668941 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j29wx\" (UniqueName: \"kubernetes.io/projected/fbdc3512-8dd4-4298-b49b-63a7fc87040c-kube-api-access-j29wx\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668966 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zghw\" (UniqueName: \"kubernetes.io/projected/87afcd70-2daf-4b05-a299-982f75f0a5fa-kube-api-access-8zghw\") pod \"ingress-canary-gsm5p\" (UID: \"87afcd70-2daf-4b05-a299-982f75f0a5fa\") " pod="openshift-ingress-canary/ingress-canary-gsm5p" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.668990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78eab148-c7db-4caf-99dd-7576fdee2366-images\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbj6p\" (UniqueName: 
\"kubernetes.io/projected/78eab148-c7db-4caf-99dd-7576fdee2366-kube-api-access-rbj6p\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669078 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-etcd-serving-ca\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: E1001 13:08:06.669098 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.169078752 +0000 UTC m=+147.223063741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669136 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-encryption-config\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-config\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669200 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6tsd\" (UniqueName: \"kubernetes.io/projected/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-kube-api-access-g6tsd\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/744adf1a-749a-4816-926e-540e6f80acc0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: 
\"744adf1a-749a-4816-926e-540e6f80acc0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669263 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5321a24-d271-46cb-9d0a-fde8089a6ddc-secret-volume\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl7r8\" (UniqueName: \"kubernetes.io/projected/87f666c4-4000-49c8-aadb-027878d0833f-kube-api-access-xl7r8\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669310 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-image-import-ca\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669333 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/31175322-99a0-4224-82d9-ca63e5a241c8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8twps\" (UID: \"31175322-99a0-4224-82d9-ca63e5a241c8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669359 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669377 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-mountpoint-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-trusted-ca-bundle\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669411 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-srv-cert\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ba15709-180a-4045-9d19-df6de2d8cf6e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 
01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqqtd\" (UniqueName: \"kubernetes.io/projected/e7b739c8-2b6d-47da-b5a6-f40a3a1ee298-kube-api-access-qqqtd\") pod \"migrator-59844c95c7-2sc4v\" (UID: \"e7b739c8-2b6d-47da-b5a6-f40a3a1ee298\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4084a94-565b-4599-b786-e1c0ed986450-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669805 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-bound-sa-token\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669826 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89ddaa81-0040-4e7c-8207-dc77ca8b888a-metrics-tls\") pod \"dns-operator-744455d44c-h699h\" (UID: \"89ddaa81-0040-4e7c-8207-dc77ca8b888a\") " pod="openshift-dns-operator/dns-operator-744455d44c-h699h" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97489cec-2ad5-4b5c-89fd-d51ab641c126-service-ca-bundle\") pod 
\"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7t5q\" (UniqueName: \"kubernetes.io/projected/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-kube-api-access-x7t5q\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwh8q\" (UniqueName: \"kubernetes.io/projected/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-kube-api-access-cwh8q\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669903 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13fabc72-2ab9-42f6-8e89-7868e3dd0228-signing-key\") pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669942 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669959 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-client-ca\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669979 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jfzr\" (UniqueName: \"kubernetes.io/projected/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-kube-api-access-2jfzr\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.669998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.671451 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.672029 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.672183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-service-ca\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.672274 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.672515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.672864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1d30b63f-8173-4be0-a523-f69f604cb48a-node-pullsecrets\") pod 
\"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.673734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97489cec-2ad5-4b5c-89fd-d51ab641c126-service-ca-bundle\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.673757 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-etcd-serving-ca\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.674823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-config\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.675618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.676488 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-serving-cert\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: 
\"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.677235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-image-import-ca\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.677616 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0499d12a-7da9-4ece-9b62-87b0141c91f1-audit-policies\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.678846 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-client-ca\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.694577 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-config\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.694866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6smx5\" (UID: 
\"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.695195 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-etcd-client\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.695339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-serving-cert\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.696152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-serving-cert\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.696178 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-client\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.697778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/744adf1a-749a-4816-926e-540e6f80acc0-srv-cert\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: \"744adf1a-749a-4816-926e-540e6f80acc0\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.698792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-certificates\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.699325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.699663 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/fbdc3512-8dd4-4298-b49b-63a7fc87040c-etcd-ca\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.699921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0499d12a-7da9-4ece-9b62-87b0141c91f1-audit-dir\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.700159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-metrics-certs\") pod \"router-default-5444994796-d2qqz\" (UID: 
\"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.701456 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-etcd-client\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.701793 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-trusted-ca\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.702304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-config\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: E1001 13:08:06.702649 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.202628447 +0000 UTC m=+147.256613346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.703713 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d30b63f-8173-4be0-a523-f69f604cb48a-audit-dir\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.704502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ba15709-180a-4045-9d19-df6de2d8cf6e-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.706307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/89ddaa81-0040-4e7c-8207-dc77ca8b888a-metrics-tls\") pod \"dns-operator-744455d44c-h699h\" (UID: \"89ddaa81-0040-4e7c-8207-dc77ca8b888a\") " pod="openshift-dns-operator/dns-operator-744455d44c-h699h" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.706337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fbdc3512-8dd4-4298-b49b-63a7fc87040c-serving-cert\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.707598 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0499d12a-7da9-4ece-9b62-87b0141c91f1-encryption-config\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.707978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ba15709-180a-4045-9d19-df6de2d8cf6e-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.708684 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/31175322-99a0-4224-82d9-ca63e5a241c8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8twps\" (UID: \"31175322-99a0-4224-82d9-ca63e5a241c8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.710392 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-default-certificate\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.711018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/97489cec-2ad5-4b5c-89fd-d51ab641c126-stats-auth\") pod 
\"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.711184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-tls\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.711977 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1d30b63f-8173-4be0-a523-f69f604cb48a-encryption-config\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.712436 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/744adf1a-749a-4816-926e-540e6f80acc0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: \"744adf1a-749a-4816-926e-540e6f80acc0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.708961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1d30b63f-8173-4be0-a523-f69f604cb48a-audit\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.712728 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.715692 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4zj5j"] Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.720894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4cw7\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-kube-api-access-w4cw7\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.749472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9zm\" (UniqueName: \"kubernetes.io/projected/31175322-99a0-4224-82d9-ca63e5a241c8-kube-api-access-8l9zm\") pod \"control-plane-machine-set-operator-78cbb6b69f-8twps\" (UID: \"31175322-99a0-4224-82d9-ca63e5a241c8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.766554 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2j6m9"] Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.767737 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgfvx\" (UniqueName: \"kubernetes.io/projected/1d30b63f-8173-4be0-a523-f69f604cb48a-kube-api-access-kgfvx\") pod \"apiserver-76f77b778f-x9q82\" (UID: \"1d30b63f-8173-4be0-a523-f69f604cb48a\") " pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4026781-41c8-4fbc-8350-a1ed556054b0-metrics-tls\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773601 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdnh7\" (UniqueName: \"kubernetes.io/projected/85af42c2-35fc-4545-8fb2-22ab8beb3e22-kube-api-access-wdnh7\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcvmb\" (UniqueName: \"kubernetes.io/projected/6d282991-125e-45a4-b6d7-65c5919cfbfa-kube-api-access-qcvmb\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zghw\" (UniqueName: \"kubernetes.io/projected/87afcd70-2daf-4b05-a299-982f75f0a5fa-kube-api-access-8zghw\") pod \"ingress-canary-gsm5p\" (UID: \"87afcd70-2daf-4b05-a299-982f75f0a5fa\") " pod="openshift-ingress-canary/ingress-canary-gsm5p" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773662 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rbj6p\" (UniqueName: \"kubernetes.io/projected/78eab148-c7db-4caf-99dd-7576fdee2366-kube-api-access-rbj6p\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78eab148-c7db-4caf-99dd-7576fdee2366-images\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773697 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6tsd\" (UniqueName: \"kubernetes.io/projected/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-kube-api-access-g6tsd\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.773714 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5321a24-d271-46cb-9d0a-fde8089a6ddc-secret-volume\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl7r8\" (UniqueName: \"kubernetes.io/projected/87f666c4-4000-49c8-aadb-027878d0833f-kube-api-access-xl7r8\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 
13:08:06.774255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-mountpoint-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-trusted-ca-bundle\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-srv-cert\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774312 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4084a94-565b-4599-b786-e1c0ed986450-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774337 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwh8q\" (UniqueName: \"kubernetes.io/projected/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-kube-api-access-cwh8q\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13fabc72-2ab9-42f6-8e89-7868e3dd0228-signing-key\") pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774368 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774390 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jfzr\" (UniqueName: \"kubernetes.io/projected/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-kube-api-access-2jfzr\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774404 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-oauth-serving-cert\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64nc8\" (UniqueName: \"kubernetes.io/projected/2c421ec0-cb43-4116-87a5-9621322f8a33-kube-api-access-64nc8\") pod 
\"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-oauth-config\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5321a24-d271-46cb-9d0a-fde8089a6ddc-config-volume\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774467 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78eab148-c7db-4caf-99dd-7576fdee2366-config\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774486 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjn2j\" (UniqueName: \"kubernetes.io/projected/c4026781-41c8-4fbc-8350-a1ed556054b0-kube-api-access-mjn2j\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-service-ca\") pod 
\"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774517 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87afcd70-2daf-4b05-a299-982f75f0a5fa-cert\") pod \"ingress-canary-gsm5p\" (UID: \"87afcd70-2daf-4b05-a299-982f75f0a5fa\") " pod="openshift-ingress-canary/ingress-canary-gsm5p" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lghp4\" (UniqueName: \"kubernetes.io/projected/d5321a24-d271-46cb-9d0a-fde8089a6ddc-kube-api-access-lghp4\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-csi-data-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqrc\" (UniqueName: \"kubernetes.io/projected/e4084a94-565b-4599-b786-e1c0ed986450-kube-api-access-sqqrc\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-config\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774599 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-client-ca\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774614 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgmhs\" (UniqueName: \"kubernetes.io/projected/13fabc72-2ab9-42f6-8e89-7868e3dd0228-kube-api-access-tgmhs\") pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6d282991-125e-45a4-b6d7-65c5919cfbfa-images\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d282991-125e-45a4-b6d7-65c5919cfbfa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc 
kubenswrapper[4749]: I1001 13:08:06.774670 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpwn2\" (UniqueName: \"kubernetes.io/projected/30a10511-6a81-4150-9ae3-976a8062accc-kube-api-access-hpwn2\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774687 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-certs\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-apiservice-cert\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774708 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-mountpoint-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a10511-6a81-4150-9ae3-976a8062accc-serving-cert\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-plugins-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7332bc1-2523-40a2-9a9c-494940d63f5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gb6ks\" (UID: \"e7332bc1-2523-40a2-9a9c-494940d63f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-registration-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-serving-cert\") pod 
\"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87f666c4-4000-49c8-aadb-027878d0833f-serving-cert\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-node-bootstrap-token\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774941 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-webhook-cert\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78eab148-c7db-4caf-99dd-7576fdee2366-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774989 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-config\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775013 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4026781-41c8-4fbc-8350-a1ed556054b0-config-volume\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnvft\" (UniqueName: \"kubernetes.io/projected/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-kube-api-access-pnvft\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775056 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-tmpfs\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e4084a94-565b-4599-b786-e1c0ed986450-proxy-tls\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775096 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mdv2b\" (UniqueName: \"kubernetes.io/projected/e7332bc1-2523-40a2-9a9c-494940d63f5b-kube-api-access-mdv2b\") pod \"multus-admission-controller-857f4d67dd-gb6ks\" (UID: \"e7332bc1-2523-40a2-9a9c-494940d63f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d282991-125e-45a4-b6d7-65c5919cfbfa-proxy-tls\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775129 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/13fabc72-2ab9-42f6-8e89-7868e3dd0228-signing-cabundle\") pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775151 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f666c4-4000-49c8-aadb-027878d0833f-config\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775173 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-socket-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.775190 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-profile-collector-cert\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:06 crc kubenswrapper[4749]: E1001 13:08:06.775935 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.275910435 +0000 UTC m=+147.329895324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.777872 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-tmpfs\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.778917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5321a24-d271-46cb-9d0a-fde8089a6ddc-config-volume\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.778924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78eab148-c7db-4caf-99dd-7576fdee2366-config\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.774550 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/78eab148-c7db-4caf-99dd-7576fdee2366-images\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.780534 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a10511-6a81-4150-9ae3-976a8062accc-serving-cert\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.780662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-csi-data-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.780673 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-socket-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " 
pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.780720 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-registration-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.780743 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/2c421ec0-cb43-4116-87a5-9621322f8a33-plugins-dir\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.783098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87f666c4-4000-49c8-aadb-027878d0833f-serving-cert\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.783150 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-oauth-config\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.783807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-node-bootstrap-token\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " 
pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.784346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e4084a94-565b-4599-b786-e1c0ed986450-proxy-tls\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.784758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d282991-125e-45a4-b6d7-65c5919cfbfa-proxy-tls\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.785159 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/78eab148-c7db-4caf-99dd-7576fdee2366-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.786755 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.790584 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d282991-125e-45a4-b6d7-65c5919cfbfa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.790642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-client-ca\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.790922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-config\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.792703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/13fabc72-2ab9-42f6-8e89-7868e3dd0228-signing-key\") pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.800367 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/13fabc72-2ab9-42f6-8e89-7868e3dd0228-signing-cabundle\") pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.800882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6d282991-125e-45a4-b6d7-65c5919cfbfa-images\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.801192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29wx\" (UniqueName: \"kubernetes.io/projected/fbdc3512-8dd4-4298-b49b-63a7fc87040c-kube-api-access-j29wx\") pod \"etcd-operator-b45778765-ql7n8\" (UID: \"fbdc3512-8dd4-4298-b49b-63a7fc87040c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.801612 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4084a94-565b-4599-b786-e1c0ed986450-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.802270 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4026781-41c8-4fbc-8350-a1ed556054b0-metrics-tls\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.802354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4026781-41c8-4fbc-8350-a1ed556054b0-config-volume\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.802838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.803238 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-config\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.803316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-oauth-serving-cert\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.803509 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-service-ca\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.803773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/87f666c4-4000-49c8-aadb-027878d0833f-config\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.804594 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-profile-collector-cert\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.804932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-srv-cert\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.806403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-certs\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.806675 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-webhook-cert\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.806758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x7t5q\" (UniqueName: \"kubernetes.io/projected/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-kube-api-access-x7t5q\") pod \"controller-manager-879f6c89f-6smx5\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.806995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/87afcd70-2daf-4b05-a299-982f75f0a5fa-cert\") pod \"ingress-canary-gsm5p\" (UID: \"87afcd70-2daf-4b05-a299-982f75f0a5fa\") " pod="openshift-ingress-canary/ingress-canary-gsm5p" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.807468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-trusted-ca-bundle\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.809167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5321a24-d271-46cb-9d0a-fde8089a6ddc-secret-volume\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.811933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.812319 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-apiservice-cert\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.814092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e7332bc1-2523-40a2-9a9c-494940d63f5b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gb6ks\" (UID: \"e7332bc1-2523-40a2-9a9c-494940d63f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.816792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-serving-cert\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.821290 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frf2s\" (UniqueName: \"kubernetes.io/projected/89ddaa81-0040-4e7c-8207-dc77ca8b888a-kube-api-access-frf2s\") pod \"dns-operator-744455d44c-h699h\" (UID: \"89ddaa81-0040-4e7c-8207-dc77ca8b888a\") " pod="openshift-dns-operator/dns-operator-744455d44c-h699h" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.848962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhz8z\" (UniqueName: \"kubernetes.io/projected/744adf1a-749a-4816-926e-540e6f80acc0-kube-api-access-qhz8z\") pod \"olm-operator-6b444d44fb-4chs2\" (UID: \"744adf1a-749a-4816-926e-540e6f80acc0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.863556 4749 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w2nbb"] Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.866966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxlfg\" (UniqueName: \"kubernetes.io/projected/1ee9f95c-7f21-46fa-bbf4-fc06601b9e42-kube-api-access-wxlfg\") pod \"openshift-controller-manager-operator-756b6f6bc6-wrtxd\" (UID: \"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.871524 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h699h" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.880302 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: E1001 13:08:06.880712 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.38070063 +0000 UTC m=+147.434685529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.881501 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-bound-sa-token\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.908655 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt"] Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.918952 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqqtd\" (UniqueName: \"kubernetes.io/projected/e7b739c8-2b6d-47da-b5a6-f40a3a1ee298-kube-api-access-qqqtd\") pod \"migrator-59844c95c7-2sc4v\" (UID: \"e7b739c8-2b6d-47da-b5a6-f40a3a1ee298\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" Oct 01 13:08:06 crc kubenswrapper[4749]: W1001 13:08:06.921111 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dcec683_7e35_4aea_95da_b43baaf89e03.slice/crio-2233e1060de0d58a680eb198ef998ce774e9998773fe907acada93b5ee8c54ac WatchSource:0}: Error finding container 2233e1060de0d58a680eb198ef998ce774e9998773fe907acada93b5ee8c54ac: Status 404 returned error can't find the container with id 
2233e1060de0d58a680eb198ef998ce774e9998773fe907acada93b5ee8c54ac Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.923337 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cp9j\" (UniqueName: \"kubernetes.io/projected/0499d12a-7da9-4ece-9b62-87b0141c91f1-kube-api-access-7cp9j\") pod \"apiserver-7bbb656c7d-t78r4\" (UID: \"0499d12a-7da9-4ece-9b62-87b0141c91f1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.950175 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w627z\" (UniqueName: \"kubernetes.io/projected/97489cec-2ad5-4b5c-89fd-d51ab641c126-kube-api-access-w627z\") pod \"router-default-5444994796-d2qqz\" (UID: \"97489cec-2ad5-4b5c-89fd-d51ab641c126\") " pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.979664 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.981266 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:06 crc kubenswrapper[4749]: E1001 13:08:06.981961 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.481945933 +0000 UTC m=+147.535930822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.982062 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.986051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbj6p\" (UniqueName: \"kubernetes.io/projected/78eab148-c7db-4caf-99dd-7576fdee2366-kube-api-access-rbj6p\") pod \"machine-api-operator-5694c8668f-nwwtl\" (UID: \"78eab148-c7db-4caf-99dd-7576fdee2366\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.993309 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw"] Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.995022 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:06 crc kubenswrapper[4749]: I1001 13:08:06.998698 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.003186 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.003947 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zghw\" (UniqueName: \"kubernetes.io/projected/87afcd70-2daf-4b05-a299-982f75f0a5fa-kube-api-access-8zghw\") pod \"ingress-canary-gsm5p\" (UID: \"87afcd70-2daf-4b05-a299-982f75f0a5fa\") " pod="openshift-ingress-canary/ingress-canary-gsm5p" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.004623 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.021577 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.023014 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.026549 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.028964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcvmb\" (UniqueName: \"kubernetes.io/projected/6d282991-125e-45a4-b6d7-65c5919cfbfa-kube-api-access-qcvmb\") pod \"machine-config-operator-74547568cd-jrr4f\" (UID: \"6d282991-125e-45a4-b6d7-65c5919cfbfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.029989 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.035401 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.053806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdnh7\" (UniqueName: \"kubernetes.io/projected/85af42c2-35fc-4545-8fb2-22ab8beb3e22-kube-api-access-wdnh7\") pod \"marketplace-operator-79b997595-n4kx2\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.063629 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6tsd\" (UniqueName: \"kubernetes.io/projected/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-kube-api-access-g6tsd\") pod \"console-f9d7485db-nsv4j\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.067556 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.072771 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr"] Oct 01 13:08:07 crc kubenswrapper[4749]: W1001 13:08:07.073850 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c4368a_f145_4cfe_96ed_c1076533fa3a.slice/crio-030c5a6a86c68df81632780fcd58c59d5b88d3f9e8bc644d896872953f6e9fed WatchSource:0}: Error finding container 030c5a6a86c68df81632780fcd58c59d5b88d3f9e8bc644d896872953f6e9fed: Status 404 returned error can't find the container with id 030c5a6a86c68df81632780fcd58c59d5b88d3f9e8bc644d896872953f6e9fed Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.083318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.083828 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.583813354 +0000 UTC m=+147.637798253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.085750 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64nc8\" (UniqueName: \"kubernetes.io/projected/2c421ec0-cb43-4116-87a5-9621322f8a33-kube-api-access-64nc8\") pod \"csi-hostpathplugin-4kcnk\" (UID: \"2c421ec0-cb43-4116-87a5-9621322f8a33\") " pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.092176 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.101727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl7r8\" (UniqueName: \"kubernetes.io/projected/87f666c4-4000-49c8-aadb-027878d0833f-kube-api-access-xl7r8\") pod \"service-ca-operator-777779d784-lvwn9\" (UID: \"87f666c4-4000-49c8-aadb-027878d0833f\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.114334 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.127721 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:07 crc kubenswrapper[4749]: W1001 13:08:07.157011 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280cdbe4_3133_4c44_a483_0be2a86f2f36.slice/crio-41130db6cfb69b026e86133b9f4bc2f6edd561b1b71676f77edd5b98276c3e14 WatchSource:0}: Error finding container 41130db6cfb69b026e86133b9f4bc2f6edd561b1b71676f77edd5b98276c3e14: Status 404 returned error can't find the container with id 41130db6cfb69b026e86133b9f4bc2f6edd561b1b71676f77edd5b98276c3e14 Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.164532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjn2j\" (UniqueName: \"kubernetes.io/projected/c4026781-41c8-4fbc-8350-a1ed556054b0-kube-api-access-mjn2j\") pod \"dns-default-nrjl4\" (UID: \"c4026781-41c8-4fbc-8350-a1ed556054b0\") " pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.166064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jfzr\" (UniqueName: \"kubernetes.io/projected/f9b5edf5-c6f0-48e3-b97c-c26c2939ec50-kube-api-access-2jfzr\") pod \"packageserver-d55dfcdfc-4g84g\" (UID: \"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.173112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" event={"ID":"698ca3be-58a9-4c4a-88ea-f0f89856cbe4","Type":"ContainerStarted","Data":"ac15bbc732193a45f0cfb55e1b24b2df49d4fd3bb7e2fa2806734b8bfe55be43"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.173164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" 
event={"ID":"698ca3be-58a9-4c4a-88ea-f0f89856cbe4","Type":"ContainerStarted","Data":"76a0c2c89c711b54db4e7db93e8a31e0495eddea373461928c2a7395591a77b7"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.176983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xk4rn" event={"ID":"dbf033db-e8ba-43f1-bf88-287c3afc79e7","Type":"ContainerStarted","Data":"d343b437941bdbe96223f0827ef6d235b9aa99d2d7c8b37ea1d897e64aa36db1"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.177016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xk4rn" event={"ID":"dbf033db-e8ba-43f1-bf88-287c3afc79e7","Type":"ContainerStarted","Data":"c33d1363c091ea72e35af890e5aaf281d719e051ccaabca4d30701948ed50b50"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.178016 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.182018 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.182570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2j6m9" event={"ID":"43aae4a1-9504-4c81-9ff1-675c0b51ced2","Type":"ContainerStarted","Data":"0c72304a6ae09f6123553068d08174631cb7543ccf67c0faa69d90100a26ba71"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.182614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2j6m9" event={"ID":"43aae4a1-9504-4c81-9ff1-675c0b51ced2","Type":"ContainerStarted","Data":"0822096de153e3de450aba3146b4e99bd8dbba2e1810fb6df6f6e7f37831f5ac"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.183246 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2j6m9" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.184039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnvft\" (UniqueName: \"kubernetes.io/projected/dfbc76ab-f650-4afe-91a5-a03c5577b4f2-kube-api-access-pnvft\") pod \"machine-config-server-g7kvk\" (UID: \"dfbc76ab-f650-4afe-91a5-a03c5577b4f2\") " pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.184661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.184713 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.684700787 +0000 UTC m=+147.738685686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.184999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwh8q\" (UniqueName: \"kubernetes.io/projected/8274c03b-8bd5-4bbc-bdbe-9b827ce71190-kube-api-access-cwh8q\") pod \"catalog-operator-68c6474976-x7drh\" (UID: \"8274c03b-8bd5-4bbc-bdbe-9b827ce71190\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.186593 4749 patch_prober.go:28] interesting pod/console-operator-58897d9998-xk4rn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.187626 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xk4rn" podUID="dbf033db-e8ba-43f1-bf88-287c3afc79e7" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.187640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.186749 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.188189 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.688177357 +0000 UTC m=+147.742162256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.189103 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-2j6m9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.189138 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2j6m9" podUID="43aae4a1-9504-4c81-9ff1-675c0b51ced2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 01 13:08:07 crc 
kubenswrapper[4749]: I1001 13:08:07.191340 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" event={"ID":"5dcec683-7e35-4aea-95da-b43baaf89e03","Type":"ContainerStarted","Data":"2233e1060de0d58a680eb198ef998ce774e9998773fe907acada93b5ee8c54ac"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.194828 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:07 crc kubenswrapper[4749]: W1001 13:08:07.199579 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97489cec_2ad5_4b5c_89fd_d51ab641c126.slice/crio-a69eedfaec77b5beaf242eee4efd41947d1fedd06a2a5f3805efdc710e6f1d15 WatchSource:0}: Error finding container a69eedfaec77b5beaf242eee4efd41947d1fedd06a2a5f3805efdc710e6f1d15: Status 404 returned error can't find the container with id a69eedfaec77b5beaf242eee4efd41947d1fedd06a2a5f3805efdc710e6f1d15 Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.199783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" event={"ID":"1e490c27-453c-4b8b-8a27-f446aee2178b","Type":"ContainerStarted","Data":"04e397cc859ef10872adbb5e481ca422ae1865155a0f26e46e6989960bcc0951"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.200532 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.203211 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.204033 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lghp4\" (UniqueName: \"kubernetes.io/projected/d5321a24-d271-46cb-9d0a-fde8089a6ddc-kube-api-access-lghp4\") pod \"collect-profiles-29322060-6hjrx\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.205077 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h699h"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.207723 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" event={"ID":"a4a7417c-f1ec-4b88-9f86-8a4da4365429","Type":"ContainerStarted","Data":"e487e055bc5e148c720e82254ca993ecbe3a9eaf3c54c1f8d30b975ddfa8354b"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.209429 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" event={"ID":"dc3a6c0f-3591-44b3-9044-b219d6a69787","Type":"ContainerStarted","Data":"ad46b47406e78545e65f18b36f987d5e25116da0c15a8a2430670b78e6c96ac3"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.213402 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" event={"ID":"a4368a25-e1cf-4ea8-89d3-18d75848a783","Type":"ContainerStarted","Data":"33f30d76bca2f3013b3ced6f3eb437184c614b5344e99b1597db7821b21934fd"} Oct 01 
13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.218052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" event={"ID":"79c4368a-f145-4cfe-96ed-c1076533fa3a","Type":"ContainerStarted","Data":"030c5a6a86c68df81632780fcd58c59d5b88d3f9e8bc644d896872953f6e9fed"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.223915 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.228481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqrc\" (UniqueName: \"kubernetes.io/projected/e4084a94-565b-4599-b786-e1c0ed986450-kube-api-access-sqqrc\") pod \"machine-config-controller-84d6567774-4bxbv\" (UID: \"e4084a94-565b-4599-b786-e1c0ed986450\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.231393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g7kvk" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.237336 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gsm5p" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.248585 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.259115 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpwn2\" (UniqueName: \"kubernetes.io/projected/30a10511-6a81-4150-9ae3-976a8062accc-kube-api-access-hpwn2\") pod \"route-controller-manager-6576b87f9c-4v2g5\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.263156 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgmhs\" (UniqueName: \"kubernetes.io/projected/13fabc72-2ab9-42f6-8e89-7868e3dd0228-kube-api-access-tgmhs\") pod \"service-ca-9c57cc56f-nxdhr\" (UID: \"13fabc72-2ab9-42f6-8e89-7868e3dd0228\") " pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.270579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" event={"ID":"9346ab8a-f5d2-4d33-be7f-4b7fe5687044","Type":"ContainerStarted","Data":"326c0918695bfc2b7ab2f3f185fa14d059e0d49516273f5bb956cbb090ad66e8"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.274675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" event={"ID":"a8c57f71-dac4-492c-b4f9-884233468771","Type":"ContainerStarted","Data":"3e0e3d60812e36443c3de70044f0c2623e738d513d3f9651cf9c06ebdce74cdf"} Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.289550 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.289840 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.789797001 +0000 UTC m=+147.843781900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.289968 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.291168 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.79115174 +0000 UTC m=+147.845136739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.298169 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdv2b\" (UniqueName: \"kubernetes.io/projected/e7332bc1-2523-40a2-9a9c-494940d63f5b-kube-api-access-mdv2b\") pod \"multus-admission-controller-857f4d67dd-gb6ks\" (UID: \"e7332bc1-2523-40a2-9a9c-494940d63f5b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.332600 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x9q82"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.393637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.396174 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.893830404 +0000 UTC m=+147.947815303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.396347 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.396970 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.896784009 +0000 UTC m=+147.950768908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.422229 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.432999 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.435398 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.454396 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.461006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.486513 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.486537 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.497804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.498507 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:07.998489485 +0000 UTC m=+148.052474384 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.592769 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-ql7n8"] Oct 01 13:08:07 crc kubenswrapper[4749]: W1001 13:08:07.594180 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7b739c8_2b6d_47da_b5a6_f40a3a1ee298.slice/crio-0da56cca1abb9ea93b7a7da05f90c2fb0edf367f0e91ebe82a9e25e4ca186680 WatchSource:0}: Error finding container 0da56cca1abb9ea93b7a7da05f90c2fb0edf367f0e91ebe82a9e25e4ca186680: Status 404 returned error can't find the container with id 
0da56cca1abb9ea93b7a7da05f90c2fb0edf367f0e91ebe82a9e25e4ca186680 Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.600571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.600958 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.100931773 +0000 UTC m=+148.154916672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.675926 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.697740 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.701988 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.702162 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.202121954 +0000 UTC m=+148.256106853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.702242 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.702616 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.202604778 +0000 UTC m=+148.256589677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.718906 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps"] Oct 01 13:08:07 crc kubenswrapper[4749]: W1001 13:08:07.753034 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfbc76ab_f650_4afe_91a5_a03c5577b4f2.slice/crio-665f3d9d9a536fc18cdbe021d40bb968e2fd63574f211be1733d5b357c157eb0 WatchSource:0}: Error finding container 665f3d9d9a536fc18cdbe021d40bb968e2fd63574f211be1733d5b357c157eb0: Status 404 returned error can't find the container with id 665f3d9d9a536fc18cdbe021d40bb968e2fd63574f211be1733d5b357c157eb0 Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.803081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.803681 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.303538302 +0000 UTC m=+148.357523201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.889553 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nsv4j"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.904426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:07 crc kubenswrapper[4749]: E1001 13:08:07.904714 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.404703813 +0000 UTC m=+148.458688712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.909239 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.927413 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh"] Oct 01 13:08:07 crc kubenswrapper[4749]: I1001 13:08:07.971837 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-nwwtl"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.005612 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.005898 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.505883844 +0000 UTC m=+148.559868743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.042875 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6smx5"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.044500 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd"] Oct 01 13:08:08 crc kubenswrapper[4749]: W1001 13:08:08.103767 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8274c03b_8bd5_4bbc_bdbe_9b827ce71190.slice/crio-14d945a21f93a6172472a0a716a504f7134b2c90807119dc4507309c6f3d3866 WatchSource:0}: Error finding container 14d945a21f93a6172472a0a716a504f7134b2c90807119dc4507309c6f3d3866: Status 404 returned error can't find the container with id 14d945a21f93a6172472a0a716a504f7134b2c90807119dc4507309c6f3d3866 Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.106544 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.106954 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.606923061 +0000 UTC m=+148.660907960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.207322 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.207785 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.707770303 +0000 UTC m=+148.761755202 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.276053 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.299691 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gsm5p"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.308843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" event={"ID":"1e490c27-453c-4b8b-8a27-f446aee2178b","Type":"ContainerStarted","Data":"6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.309839 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.310367 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.310674 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.810663043 +0000 UTC m=+148.864647942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.313955 4749 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-4zj5j container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" start-of-body= Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.314412 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" podUID="1e490c27-453c-4b8b-8a27-f446aee2178b" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.12:6443/healthz\": dial tcp 10.217.0.12:6443: connect: connection refused" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.333171 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" event={"ID":"31175322-99a0-4224-82d9-ca63e5a241c8","Type":"ContainerStarted","Data":"c3be14f48e36aa6f7b87a703ed370b255562b696940cd17e924e174d09d1b9e3"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.346033 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv"] Oct 
01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.355128 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4kx2"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.358193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" event={"ID":"8274c03b-8bd5-4bbc-bdbe-9b827ce71190","Type":"ContainerStarted","Data":"14d945a21f93a6172472a0a716a504f7134b2c90807119dc4507309c6f3d3866"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.361148 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nrjl4"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.364073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" event={"ID":"87f666c4-4000-49c8-aadb-027878d0833f","Type":"ContainerStarted","Data":"2f22a452c873c2bdb85f02c088f29060c801aad99ccf922ebb90421ab8ffca1a"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.376919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" event={"ID":"7daba5cb-4db9-41c4-b8a7-3a865bb69776","Type":"ContainerStarted","Data":"c5edb6fb34b8188af77e69a48343f3ec8ade22b8cd7881f7c5c0a1f0a6e23401"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.376954 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" event={"ID":"7daba5cb-4db9-41c4-b8a7-3a865bb69776","Type":"ContainerStarted","Data":"0aaecab8a79905042b5758b453d9f5ac4f1b8c003b6241e7368da332b9be9ac1"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.386411 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2j6m9" 
podStartSLOduration=124.386400332 podStartE2EDuration="2m4.386400332s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:08.384494827 +0000 UTC m=+148.438479716" watchObservedRunningTime="2025-10-01 13:08:08.386400332 +0000 UTC m=+148.440385231" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.397514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-d2qqz" event={"ID":"97489cec-2ad5-4b5c-89fd-d51ab641c126","Type":"ContainerStarted","Data":"10b960a42554aa2a1b767bc94a50253b0e26dba4ca7728f7f4a99d5dae32b841"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.397562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-d2qqz" event={"ID":"97489cec-2ad5-4b5c-89fd-d51ab641c126","Type":"ContainerStarted","Data":"a69eedfaec77b5beaf242eee4efd41947d1fedd06a2a5f3805efdc710e6f1d15"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.404903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" event={"ID":"e7b739c8-2b6d-47da-b5a6-f40a3a1ee298","Type":"ContainerStarted","Data":"0da56cca1abb9ea93b7a7da05f90c2fb0edf367f0e91ebe82a9e25e4ca186680"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.411116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" event={"ID":"0499d12a-7da9-4ece-9b62-87b0141c91f1","Type":"ContainerStarted","Data":"e2a7d16a63fd2635d14fd5dfb3380e7dd392727bf71088d4a54babe7f6d30797"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.411629 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.412083 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:08.91205476 +0000 UTC m=+148.966039659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: W1001 13:08:08.412723 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d282991_125e_45a4_b6d7_65c5919cfbfa.slice/crio-9b740d26a6865d092a71de5a62f9f3fddbe6a2a707d2d49b9385920e7db6247b WatchSource:0}: Error finding container 9b740d26a6865d092a71de5a62f9f3fddbe6a2a707d2d49b9385920e7db6247b: Status 404 returned error can't find the container with id 9b740d26a6865d092a71de5a62f9f3fddbe6a2a707d2d49b9385920e7db6247b Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.417334 4749 generic.go:334] "Generic (PLEG): container finished" podID="c560bc26-ad30-401e-823c-66e9b1a999a1" containerID="abee0b3bb6d7c1911784d05424ddbfb56d8e7f063a0a123abc002487152fa23b" exitCode=0 Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.417399 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" event={"ID":"c560bc26-ad30-401e-823c-66e9b1a999a1","Type":"ContainerDied","Data":"abee0b3bb6d7c1911784d05424ddbfb56d8e7f063a0a123abc002487152fa23b"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.417430 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" event={"ID":"c560bc26-ad30-401e-823c-66e9b1a999a1","Type":"ContainerStarted","Data":"2aa1116b705437c42bd8eddcaa2eb20856b15d83c13ae9664e6339d29bf29d48"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.418103 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4kcnk"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.442910 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" event={"ID":"a4368a25-e1cf-4ea8-89d3-18d75848a783","Type":"ContainerStarted","Data":"290c89f831b52228e02ae4232c6e2eda3a38901f11d49637ec5843190fb962cc"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.457545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" event={"ID":"1980f6f8-707b-4dd0-955e-156b0e1598e3","Type":"ContainerStarted","Data":"b0cc92758765d329094416caa047d8b7243a7e7dadc0485a8b0a329a4aaf076a"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.457587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" event={"ID":"1980f6f8-707b-4dd0-955e-156b0e1598e3","Type":"ContainerStarted","Data":"64db406b88319b132f1dae91cfd6635e8a1377dc649e6f0ad457cb76c53ffc33"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.481337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h699h" 
event={"ID":"89ddaa81-0040-4e7c-8207-dc77ca8b888a","Type":"ContainerStarted","Data":"300bb8e4e7da86a82cac681de9131adfa45810126bf2e987a8575b3f1a5b4d33"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.490349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" event={"ID":"5dcec683-7e35-4aea-95da-b43baaf89e03","Type":"ContainerStarted","Data":"12765ab22c2070481430290524fe46c5e27447b6f48f35f8048987f47d838ddd"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.501987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" event={"ID":"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565","Type":"ContainerStarted","Data":"cf00cee0fd01789c717b276ef4831cdc2a24d61d9d0dae82775f7cb481f0a48e"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.516609 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.521957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" event={"ID":"fbdc3512-8dd4-4298-b49b-63a7fc87040c","Type":"ContainerStarted","Data":"aabfccf81a64ff0c4a0e983110b2ef7c8aba1e22eb2e5a0a76840657a284484a"} Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.523484 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.023469526 +0000 UTC m=+149.077454425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.531502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" event={"ID":"79c4368a-f145-4cfe-96ed-c1076533fa3a","Type":"ContainerStarted","Data":"1fc03c172274a4160b79391c52e29464de2114169debfaa431f6cb8732bb3766"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.532763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" event={"ID":"9346ab8a-f5d2-4d33-be7f-4b7fe5687044","Type":"ContainerStarted","Data":"b8e14445f6b845354854ce4ee08d00a5a05ebe56701fb3469e3cc16c60589d1c"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.538077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" event={"ID":"dc3a6c0f-3591-44b3-9044-b219d6a69787","Type":"ContainerStarted","Data":"9a5afe4c85677bd19e17435649e0baabf9c4194c9fb8e5ce1511c6fbf3a30b9e"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.563300 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" event={"ID":"280cdbe4-3133-4c44-a483-0be2a86f2f36","Type":"ContainerStarted","Data":"41130db6cfb69b026e86133b9f4bc2f6edd561b1b71676f77edd5b98276c3e14"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.587132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" event={"ID":"744adf1a-749a-4816-926e-540e6f80acc0","Type":"ContainerStarted","Data":"b8fa32e15f97476eb1d41d1f0ca2b12f811f1da20e754fb80cef86080b52197a"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.592137 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" event={"ID":"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42","Type":"ContainerStarted","Data":"a8fe549ecd48dbd58468249d1e2f57beaba8d2090e914dcfad6dab26db36c75b"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.609939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" event={"ID":"1d30b63f-8173-4be0-a523-f69f604cb48a","Type":"ContainerStarted","Data":"0570cfc5fd0864ff239a497a37b271f4edeedb7e494bc6c96af5e19847dd37f5"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.610679 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5mk4q" podStartSLOduration=124.610651365 podStartE2EDuration="2m4.610651365s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:08.585865331 +0000 UTC m=+148.639850230" watchObservedRunningTime="2025-10-01 13:08:08.610651365 +0000 UTC m=+148.664636284" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.617627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.617951 4749 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.117937044 +0000 UTC m=+149.171921943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.626334 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.630240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g7kvk" event={"ID":"dfbc76ab-f650-4afe-91a5-a03c5577b4f2","Type":"ContainerStarted","Data":"665f3d9d9a536fc18cdbe021d40bb968e2fd63574f211be1733d5b357c157eb0"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.642718 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nxdhr"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.651341 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" event={"ID":"a4a7417c-f1ec-4b88-9f86-8a4da4365429","Type":"ContainerStarted","Data":"6fe9a676675cdb1c87ef74e18187867e5e8006ce81d7f9239419a73f80924681"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.651376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" event={"ID":"a4a7417c-f1ec-4b88-9f86-8a4da4365429","Type":"ContainerStarted","Data":"7492fe035b2f1c0047d06c4179e5f4e23536d105038f6a799569a87f41ce136d"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.659823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" event={"ID":"78eab148-c7db-4caf-99dd-7576fdee2366","Type":"ContainerStarted","Data":"f7e833d4366a94579cb59464f0dfc19ff6b842347d10fdf915f9bcba4cefdabb"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.665995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nsv4j" event={"ID":"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d","Type":"ContainerStarted","Data":"adfa3a1ffa1141b238ffa865a22485ef5737c10a702ffbf8e058453175350053"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.682912 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" event={"ID":"a8c57f71-dac4-492c-b4f9-884233468771","Type":"ContainerStarted","Data":"11311e428d74eb28e51b8eadf868243e468e4ab2fd33311094dbce806ddccca8"} Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.689588 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-2j6m9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.689630 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2j6m9" podUID="43aae4a1-9504-4c81-9ff1-675c0b51ced2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 
13:08:08.718259 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.719058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.720704 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.220692021 +0000 UTC m=+149.274676920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.722931 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.760499 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xk4rn" podStartSLOduration=124.760485256 podStartE2EDuration="2m4.760485256s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:08.718624871 +0000 UTC m=+148.772609810" watchObservedRunningTime="2025-10-01 13:08:08.760485256 +0000 UTC m=+148.814470155" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.820328 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.820537 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.320517653 +0000 UTC m=+149.374502552 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.820673 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.822812 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.322797339 +0000 UTC m=+149.376782238 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.881578 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" podStartSLOduration=124.881556879 podStartE2EDuration="2m4.881556879s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:08.824290472 +0000 UTC m=+148.878275371" watchObservedRunningTime="2025-10-01 13:08:08.881556879 +0000 UTC m=+148.935541778" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.908231 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8lpdc" podStartSLOduration=124.908200866 podStartE2EDuration="2m4.908200866s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:08.864488908 +0000 UTC m=+148.918473807" watchObservedRunningTime="2025-10-01 13:08:08.908200866 +0000 UTC m=+148.962185765" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.921957 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:08 crc kubenswrapper[4749]: E1001 13:08:08.922273 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.42225953 +0000 UTC m=+149.476244419 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.946258 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h64lc" podStartSLOduration=124.94624223 podStartE2EDuration="2m4.94624223s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:08.90695919 +0000 UTC m=+148.960944089" watchObservedRunningTime="2025-10-01 13:08:08.94624223 +0000 UTC m=+149.000227129" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.946383 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" podStartSLOduration=124.946380354 podStartE2EDuration="2m4.946380354s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:08.945413907 +0000 UTC m=+148.999398816" watchObservedRunningTime="2025-10-01 13:08:08.946380354 +0000 UTC m=+149.000365253" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.978762 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gb6ks"] Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.984377 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.995051 4749 patch_prober.go:28] interesting pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:08 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:08 crc kubenswrapper[4749]: [+]process-running ok Oct 01 13:08:08 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.995101 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:08:08 crc kubenswrapper[4749]: I1001 13:08:08.997207 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-w2nbb" podStartSLOduration=124.997189236 podStartE2EDuration="2m4.997189236s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:08.995517498 +0000 UTC m=+149.049502397" 
watchObservedRunningTime="2025-10-01 13:08:08.997189236 +0000 UTC m=+149.051174135" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.025091 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.025454 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.525441039 +0000 UTC m=+149.579425938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.080057 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ntsgr" podStartSLOduration=125.08002979 podStartE2EDuration="2m5.08002979s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.079607558 +0000 UTC m=+149.133592457" watchObservedRunningTime="2025-10-01 13:08:09.08002979 +0000 UTC m=+149.134014689" Oct 01 13:08:09 crc kubenswrapper[4749]: 
I1001 13:08:09.124277 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-d2pbw" podStartSLOduration=125.124258842 podStartE2EDuration="2m5.124258842s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.122758049 +0000 UTC m=+149.176742948" watchObservedRunningTime="2025-10-01 13:08:09.124258842 +0000 UTC m=+149.178243741" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.126776 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.127065 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.627048743 +0000 UTC m=+149.681033642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.219935 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-d2qqz" podStartSLOduration=125.219901954 podStartE2EDuration="2m5.219901954s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.185292869 +0000 UTC m=+149.239277758" watchObservedRunningTime="2025-10-01 13:08:09.219901954 +0000 UTC m=+149.273886853" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.231734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.232564 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.732550788 +0000 UTC m=+149.786535677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.335700 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.335866 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.83583535 +0000 UTC m=+149.889820259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.336180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.336550 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.83653677 +0000 UTC m=+149.890521659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.384486 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xk4rn" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.413824 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mb8kf" podStartSLOduration=125.413805943 podStartE2EDuration="2m5.413805943s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.268668427 +0000 UTC m=+149.322653326" watchObservedRunningTime="2025-10-01 13:08:09.413805943 +0000 UTC m=+149.467790862" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.437648 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.437870 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 13:08:09.937835355 +0000 UTC m=+149.991820244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.438027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.438389 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:09.93837592 +0000 UTC m=+149.992360819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.539317 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.539818 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.039803999 +0000 UTC m=+150.093788888 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.648001 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.648334 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.148319261 +0000 UTC m=+150.202304160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.707959 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" event={"ID":"6d282991-125e-45a4-b6d7-65c5919cfbfa","Type":"ContainerStarted","Data":"cdb6984fbb2b287d01d4213b15fdd94fa694e414f2f1be8d992c7e0a4f64d705"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.708361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" event={"ID":"6d282991-125e-45a4-b6d7-65c5919cfbfa","Type":"ContainerStarted","Data":"9b740d26a6865d092a71de5a62f9f3fddbe6a2a707d2d49b9385920e7db6247b"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.709663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" event={"ID":"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50","Type":"ContainerStarted","Data":"bb31401924a947050c0611af063611744ca7aefeff8135896d4f1249e1accb01"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.709695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" event={"ID":"f9b5edf5-c6f0-48e3-b97c-c26c2939ec50","Type":"ContainerStarted","Data":"05085d6ec58b2e7e2fdf692178a593633e4bbbfdbe49eceb82a3f1ab07a0ea68"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.713273 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.715947 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4g84g container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.716012 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" podUID="f9b5edf5-c6f0-48e3-b97c-c26c2939ec50" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.718288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" event={"ID":"31175322-99a0-4224-82d9-ca63e5a241c8","Type":"ContainerStarted","Data":"7e3579eb7ed78c273cbf582abb9d197941a3edf738ca41cc415db349d69f2ecf"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.720723 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gsm5p" event={"ID":"87afcd70-2daf-4b05-a299-982f75f0a5fa","Type":"ContainerStarted","Data":"3f60e087ce18879c3655c5b9a139a10b24b6ab7f97358c947c5008e64c2d27e6"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.720754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gsm5p" event={"ID":"87afcd70-2daf-4b05-a299-982f75f0a5fa","Type":"ContainerStarted","Data":"dea1675375f8aab47d7b9e0371c3e84357829e60a7ac819c21bbfafb6a2ed386"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.722813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" 
event={"ID":"2c421ec0-cb43-4116-87a5-9621322f8a33","Type":"ContainerStarted","Data":"d7b42a0aa7748fea066d655d58cf9429b0c7f92193a32375efd16670c771d8db"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.724493 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" event={"ID":"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565","Type":"ContainerStarted","Data":"03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.725103 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.730056 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" podStartSLOduration=124.730042752 podStartE2EDuration="2m4.730042752s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.729379273 +0000 UTC m=+149.783364172" watchObservedRunningTime="2025-10-01 13:08:09.730042752 +0000 UTC m=+149.784027651" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.731523 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6smx5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.731573 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" podUID="e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": 
dial tcp 10.217.0.19:8443: connect: connection refused" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.745631 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h699h" event={"ID":"89ddaa81-0040-4e7c-8207-dc77ca8b888a","Type":"ContainerStarted","Data":"68a6f7344c39e92fab6436c90ecf75b204ed4864b284ce180287676a00398919"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.748700 4749 generic.go:334] "Generic (PLEG): container finished" podID="1d30b63f-8173-4be0-a523-f69f604cb48a" containerID="5e4a4d57bbc33fc0056cc299f40b729e72f882b3a46ecb15ee699e129112c0c3" exitCode=0 Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.748805 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.749029 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" event={"ID":"1d30b63f-8173-4be0-a523-f69f604cb48a","Type":"ContainerDied","Data":"5e4a4d57bbc33fc0056cc299f40b729e72f882b3a46ecb15ee699e129112c0c3"} Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.749137 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.249119011 +0000 UTC m=+150.303103910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.751324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.752942 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.252929501 +0000 UTC m=+150.306914400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.757712 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gsm5p" podStartSLOduration=5.757695568 podStartE2EDuration="5.757695568s" podCreationTimestamp="2025-10-01 13:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.751562471 +0000 UTC m=+149.805547370" watchObservedRunningTime="2025-10-01 13:08:09.757695568 +0000 UTC m=+149.811680467" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.767517 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" event={"ID":"13fabc72-2ab9-42f6-8e89-7868e3dd0228","Type":"ContainerStarted","Data":"8a0373e84a1d9e876fe863e130033bd871b57b420d325a7ac6516cc33766f00f"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.786602 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nsv4j" event={"ID":"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d","Type":"ContainerStarted","Data":"9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.789323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" 
event={"ID":"e7332bc1-2523-40a2-9a9c-494940d63f5b","Type":"ContainerStarted","Data":"debbf2a02a5f6b86d8f92ed83a13f783b23648a57f9910d8720c1ee4dea85dbe"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.790410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" event={"ID":"85af42c2-35fc-4545-8fb2-22ab8beb3e22","Type":"ContainerStarted","Data":"3bb94c31a6861b3dccacd1c1666dded2341d1921b17ee74d1c61c33914f608e6"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.790432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" event={"ID":"85af42c2-35fc-4545-8fb2-22ab8beb3e22","Type":"ContainerStarted","Data":"b3046022cd5326043e2ec1d27a0b7081429959bc1fd0aab4bd17b5bf4925ce57"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.790871 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.792021 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n4kx2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.792055 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" podUID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.792719 4749 generic.go:334] "Generic (PLEG): container finished" podID="0499d12a-7da9-4ece-9b62-87b0141c91f1" 
containerID="3c803cc54d8491cc542da03044d7b57404f137c595e3b20f3d9bb3212050d5ed" exitCode=0 Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.792772 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" event={"ID":"0499d12a-7da9-4ece-9b62-87b0141c91f1","Type":"ContainerDied","Data":"3c803cc54d8491cc542da03044d7b57404f137c595e3b20f3d9bb3212050d5ed"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.796951 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" event={"ID":"78eab148-c7db-4caf-99dd-7576fdee2366","Type":"ContainerStarted","Data":"d4b377815ff242aa6745656792189aee361b399313aff4e0cf3c0053c254ecd7"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.806903 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" event={"ID":"d5321a24-d271-46cb-9d0a-fde8089a6ddc","Type":"ContainerStarted","Data":"d09e4abe7c9b199e38017b52d57c783aac5ad187ac71bb789a2d4d0f1d648826"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.806989 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" event={"ID":"d5321a24-d271-46cb-9d0a-fde8089a6ddc","Type":"ContainerStarted","Data":"f28b89cca67bafeadbc385d6ed27b43692efc06ffdb95749c34a40e139cc00b7"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.823616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" event={"ID":"fbdc3512-8dd4-4298-b49b-63a7fc87040c","Type":"ContainerStarted","Data":"be4e5ef241ec215a6b333ad4c1354900c4b5c9d064b517cc720024161c884b99"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.824208 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8twps" 
podStartSLOduration=124.82416975 podStartE2EDuration="2m4.82416975s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.821689119 +0000 UTC m=+149.875674018" watchObservedRunningTime="2025-10-01 13:08:09.82416975 +0000 UTC m=+149.878154669" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.842740 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" podStartSLOduration=125.842709784 podStartE2EDuration="2m5.842709784s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.84117704 +0000 UTC m=+149.895161949" watchObservedRunningTime="2025-10-01 13:08:09.842709784 +0000 UTC m=+149.896694673" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.845157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" event={"ID":"744adf1a-749a-4816-926e-540e6f80acc0","Type":"ContainerStarted","Data":"4c16dc9f8daf749227b085aa91dbe5693b09b47dcac6d687ebd4e05085988d0b"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.846144 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.849347 4749 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-4chs2 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.849392 4749 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" podUID="744adf1a-749a-4816-926e-540e6f80acc0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.851799 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.853935 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.353915966 +0000 UTC m=+150.407900865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.861649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" event={"ID":"87f666c4-4000-49c8-aadb-027878d0833f","Type":"ContainerStarted","Data":"aff76fe97af718e2a3f0fe2dfe653eff7c137bcce4ccbe5f878b6db9dfee66d7"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.871136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" event={"ID":"dc3a6c0f-3591-44b3-9044-b219d6a69787","Type":"ContainerStarted","Data":"255cf50959f25013258bd784b1a0f9a8958251cb600f3dd60823dbcb0b0acd16"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.886629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" event={"ID":"5dcec683-7e35-4aea-95da-b43baaf89e03","Type":"ContainerStarted","Data":"f4e026cf75eb021f2795a6364d799a609758d61daf97502c5aba4c5e1e99bfc8"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.902684 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nsv4j" podStartSLOduration=125.902669109 podStartE2EDuration="2m5.902669109s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.877011321 +0000 UTC m=+149.930996220" 
watchObservedRunningTime="2025-10-01 13:08:09.902669109 +0000 UTC m=+149.956653998" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.906907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" event={"ID":"e7b739c8-2b6d-47da-b5a6-f40a3a1ee298","Type":"ContainerStarted","Data":"44e3207eb94cbe2548d2a77e0452a0db5d5c452a9951fdc33d1efdd9593240df"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.927519 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" podStartSLOduration=124.927500693 podStartE2EDuration="2m4.927500693s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.905108469 +0000 UTC m=+149.959093368" watchObservedRunningTime="2025-10-01 13:08:09.927500693 +0000 UTC m=+149.981485592" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.936937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" event={"ID":"8274c03b-8bd5-4bbc-bdbe-9b827ce71190","Type":"ContainerStarted","Data":"7fefc63271e7678e93e34206c4d67a4668065615b12344106be1e5d1e0b3a56f"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.937096 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.938199 4749 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-x7drh container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 
13:08:09.938304 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" podUID="8274c03b-8bd5-4bbc-bdbe-9b827ce71190" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.946583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" event={"ID":"e4084a94-565b-4599-b786-e1c0ed986450","Type":"ContainerStarted","Data":"b0497b84977bfd8e461620e3a4548dd28f10fa4e55931f1aaaab6ab38cd66e87"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.946618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" event={"ID":"e4084a94-565b-4599-b786-e1c0ed986450","Type":"ContainerStarted","Data":"7bca240b68dc05da008e89416b72d850931cc63cc3597341ba7daa771951ca59"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.947323 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" podStartSLOduration=124.947305713 podStartE2EDuration="2m4.947305713s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.943340329 +0000 UTC m=+149.997325228" watchObservedRunningTime="2025-10-01 13:08:09.947305713 +0000 UTC m=+150.001290612" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.953939 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" 
(UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:09 crc kubenswrapper[4749]: E1001 13:08:09.960890 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.460864053 +0000 UTC m=+150.514848952 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.970556 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" event={"ID":"79c4368a-f145-4cfe-96ed-c1076533fa3a","Type":"ContainerStarted","Data":"619241b0b8d48c7ff3f8418d3af0662c632a6d18c0ca75c70353ca82af60b7ad"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.970821 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.974541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g7kvk" event={"ID":"dfbc76ab-f650-4afe-91a5-a03c5577b4f2","Type":"ContainerStarted","Data":"a3294582ded46ce2bdaef350bed211d762980d45d45925c0f388a6c7d2f484b3"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.988454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" event={"ID":"1ee9f95c-7f21-46fa-bbf4-fc06601b9e42","Type":"ContainerStarted","Data":"9ff954a04563d153ade3060dfab2b868c9db2424c5d72cca987c5ab8f5db4b84"} Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.989759 4749 patch_prober.go:28] interesting pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:09 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:09 crc kubenswrapper[4749]: [+]process-running ok Oct 01 13:08:09 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.989793 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:08:09 crc kubenswrapper[4749]: I1001 13:08:09.997537 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-ql7n8" podStartSLOduration=125.997516938 podStartE2EDuration="2m5.997516938s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:09.995886681 +0000 UTC m=+150.049871590" watchObservedRunningTime="2025-10-01 13:08:09.997516938 +0000 UTC m=+150.051501837" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.016319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-kf4bh" 
event={"ID":"280cdbe4-3133-4c44-a483-0be2a86f2f36","Type":"ContainerStarted","Data":"b842d9178ad631784d5dbf002e75559c22af76f22812bf9aa7bdc978c56a3c78"} Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.031812 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" event={"ID":"30a10511-6a81-4150-9ae3-976a8062accc","Type":"ContainerStarted","Data":"fc802df61ae6788dc94c50aaf861131e56a6bb6b0668429b62e7616db74be6b6"} Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.032359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.036271 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4v2g5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.036315 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" podUID="30a10511-6a81-4150-9ae3-976a8062accc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.055012 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" podStartSLOduration=126.054996652 podStartE2EDuration="2m6.054996652s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
13:08:10.030044164 +0000 UTC m=+150.084029063" watchObservedRunningTime="2025-10-01 13:08:10.054996652 +0000 UTC m=+150.108981551" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.060745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" event={"ID":"c560bc26-ad30-401e-823c-66e9b1a999a1","Type":"ContainerStarted","Data":"2d064a2795f72c63af8fd2414ccbccce0dc53f02bbc9a04e19cf6e5e512d0fba"} Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.060847 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.072635 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.073306 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.573290318 +0000 UTC m=+150.627275217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.082186 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" podStartSLOduration=125.082172624 podStartE2EDuration="2m5.082172624s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.07959677 +0000 UTC m=+150.133581669" watchObservedRunningTime="2025-10-01 13:08:10.082172624 +0000 UTC m=+150.136157523" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.089131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrjl4" event={"ID":"c4026781-41c8-4fbc-8350-a1ed556054b0","Type":"ContainerStarted","Data":"319de1fba7c50aad81092eb7c5a92814639f6eee86994bc9a933c15db7557413"} Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.089173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrjl4" event={"ID":"c4026781-41c8-4fbc-8350-a1ed556054b0","Type":"ContainerStarted","Data":"aa63f126b6f73b8eb37a1aa2d8a6a5d1083509ad6b6ef91b3529504f69055523"} Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.091486 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-2j6m9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" 
start-of-body= Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.091518 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2j6m9" podUID="43aae4a1-9504-4c81-9ff1-675c0b51ced2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.101599 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.105758 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-g7kvk" podStartSLOduration=6.105740282 podStartE2EDuration="6.105740282s" podCreationTimestamp="2025-10-01 13:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.099249435 +0000 UTC m=+150.153234334" watchObservedRunningTime="2025-10-01 13:08:10.105740282 +0000 UTC m=+150.159725181" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.117631 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-ztkh5" podStartSLOduration=126.117617044 podStartE2EDuration="2m6.117617044s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.117264494 +0000 UTC m=+150.171249393" watchObservedRunningTime="2025-10-01 13:08:10.117617044 +0000 UTC m=+150.171601943" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.159749 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" 
podStartSLOduration=126.159728205 podStartE2EDuration="2m6.159728205s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.140603545 +0000 UTC m=+150.194588444" watchObservedRunningTime="2025-10-01 13:08:10.159728205 +0000 UTC m=+150.213713094" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.174197 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.177763 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.677750814 +0000 UTC m=+150.731735703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.181742 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" podStartSLOduration=126.181726948 podStartE2EDuration="2m6.181726948s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.158725676 +0000 UTC m=+150.212710575" watchObservedRunningTime="2025-10-01 13:08:10.181726948 +0000 UTC m=+150.235711847" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.183171 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" podStartSLOduration=125.18316599 podStartE2EDuration="2m5.18316599s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.180822252 +0000 UTC m=+150.234807151" watchObservedRunningTime="2025-10-01 13:08:10.18316599 +0000 UTC m=+150.237150889" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.210305 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7bfdt" podStartSLOduration=126.21028971 podStartE2EDuration="2m6.21028971s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.208676314 +0000 UTC m=+150.262661213" watchObservedRunningTime="2025-10-01 13:08:10.21028971 +0000 UTC m=+150.264274619" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.230309 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" podStartSLOduration=126.230288906 podStartE2EDuration="2m6.230288906s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.229468642 +0000 UTC m=+150.283453541" watchObservedRunningTime="2025-10-01 13:08:10.230288906 +0000 UTC m=+150.284273805" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.252078 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" podStartSLOduration=126.252059512 podStartE2EDuration="2m6.252059512s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.250300611 +0000 UTC m=+150.304285510" watchObservedRunningTime="2025-10-01 13:08:10.252059512 +0000 UTC m=+150.306044411" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.275081 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.275202 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.775177317 +0000 UTC m=+150.829162216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.275451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.275712 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.775699722 +0000 UTC m=+150.829684621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.280422 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wrtxd" podStartSLOduration=126.280404138 podStartE2EDuration="2m6.280404138s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.278110351 +0000 UTC m=+150.332095250" watchObservedRunningTime="2025-10-01 13:08:10.280404138 +0000 UTC m=+150.334389037" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.297111 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvwn9" podStartSLOduration=125.297097408 podStartE2EDuration="2m5.297097408s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.295834031 +0000 UTC m=+150.349818930" watchObservedRunningTime="2025-10-01 13:08:10.297097408 +0000 UTC m=+150.351082307" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.377101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.377369 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.877327255 +0000 UTC m=+150.931312154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.377808 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.378123 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.878110848 +0000 UTC m=+150.932095747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.488660 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.488975 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:10.988959127 +0000 UTC m=+151.042944026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.589868 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.590305 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.090267802 +0000 UTC m=+151.144252701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.691628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.691795 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.191764042 +0000 UTC m=+151.245748951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.692284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.692668 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.192656238 +0000 UTC m=+151.246641147 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.794105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.794309 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.294285112 +0000 UTC m=+151.348270001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.794640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.794988 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.294976722 +0000 UTC m=+151.348961611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.896111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.896488 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.396461042 +0000 UTC m=+151.450445941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.985289 4749 patch_prober.go:28] interesting pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:10 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:10 crc kubenswrapper[4749]: [+]process-running ok Oct 01 13:08:10 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.985342 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:08:10 crc kubenswrapper[4749]: I1001 13:08:10.997432 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:10 crc kubenswrapper[4749]: E1001 13:08:10.998024 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 13:08:11.498007854 +0000 UTC m=+151.551992753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.098509 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.098841 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.598817874 +0000 UTC m=+151.652802773 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.106114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-nwwtl" event={"ID":"78eab148-c7db-4caf-99dd-7576fdee2366","Type":"ContainerStarted","Data":"f25367a0ab8db5b9d4f9753fc10abdfcc720f5817c6a0a861ed6399edf35d4b3"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.108933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h699h" event={"ID":"89ddaa81-0040-4e7c-8207-dc77ca8b888a","Type":"ContainerStarted","Data":"65e47e97c83c55f89453b58cd360988dbb6bf9f6edacade7910eb8ffc2366516"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.113729 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" event={"ID":"1d30b63f-8173-4be0-a523-f69f604cb48a","Type":"ContainerStarted","Data":"b31af8c42b321c7d8abc330174ed1750c1c3cb0c733bd872e5beda88ecf0c775"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.113769 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" event={"ID":"1d30b63f-8173-4be0-a523-f69f604cb48a","Type":"ContainerStarted","Data":"b6e3ea209960f32fda0b28458b4044992f264817449b7571af78b856a2eaf645"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.115284 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" 
event={"ID":"13fabc72-2ab9-42f6-8e89-7868e3dd0228","Type":"ContainerStarted","Data":"14addc0a876461919621ac576937852a0610bba2b2ba7934e6d4efd7c5ca65b7"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.118286 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" event={"ID":"e7332bc1-2523-40a2-9a9c-494940d63f5b","Type":"ContainerStarted","Data":"2483a882de814a1d75a1d1a8c1b421559ef4a8d61f7496d34e7f7c7e6dc3f13b"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.118346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" event={"ID":"e7332bc1-2523-40a2-9a9c-494940d63f5b","Type":"ContainerStarted","Data":"92c934283c25616ab5f00240d625fe9890e7e8153b63610bf5f03a9a388a0ed4"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.119658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" event={"ID":"30a10511-6a81-4150-9ae3-976a8062accc","Type":"ContainerStarted","Data":"a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.120322 4749 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-4v2g5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.120359 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" podUID="30a10511-6a81-4150-9ae3-976a8062accc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Oct 01 13:08:11 crc 
kubenswrapper[4749]: I1001 13:08:11.121116 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" event={"ID":"6d282991-125e-45a4-b6d7-65c5919cfbfa","Type":"ContainerStarted","Data":"8b8372e6947960444f67faa2b1c891593c682ece03a552931b92664ad0c230dd"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.123113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nrjl4" event={"ID":"c4026781-41c8-4fbc-8350-a1ed556054b0","Type":"ContainerStarted","Data":"cd146bcf9bd3ee0f025e747ad632d9928857b2a55091804e6b48990ed7d2cebc"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.123561 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.125171 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-2sc4v" event={"ID":"e7b739c8-2b6d-47da-b5a6-f40a3a1ee298","Type":"ContainerStarted","Data":"6f757cf1648fd6d46cce62314f51a00fe9a0f23070985171ac9d2cd90e203957"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.127095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" event={"ID":"2c421ec0-cb43-4116-87a5-9621322f8a33","Type":"ContainerStarted","Data":"1c81386dcb69f5c241b2b7c058c507ced0f089d0b70862f349f698f7a4c5157a"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.128827 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4bxbv" event={"ID":"e4084a94-565b-4599-b786-e1c0ed986450","Type":"ContainerStarted","Data":"d13d14771da708bd069b811574be4630d2eef89e1eddf69967b3b7e43c418826"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.130316 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" podStartSLOduration=127.13030471 podStartE2EDuration="2m7.13030471s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:10.352678016 +0000 UTC m=+150.406662915" watchObservedRunningTime="2025-10-01 13:08:11.13030471 +0000 UTC m=+151.184289619" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.137318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" event={"ID":"0499d12a-7da9-4ece-9b62-87b0141c91f1","Type":"ContainerStarted","Data":"016e7142343931c38a625153824d9257cb64d94c3bdc1957911ccb993b6b9300"} Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.137805 4749 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4g84g container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.137851 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" podUID="f9b5edf5-c6f0-48e3-b97c-c26c2939ec50" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.138634 4749 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6smx5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.138749 4749 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" podUID="e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.139786 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n4kx2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.139834 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" podUID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.151489 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-4chs2" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.156203 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-x7drh" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.192750 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-h699h" podStartSLOduration=127.192730536 podStartE2EDuration="2m7.192730536s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:11.133163242 +0000 UTC m=+151.187148151" 
watchObservedRunningTime="2025-10-01 13:08:11.192730536 +0000 UTC m=+151.246715435" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.192979 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jrr4f" podStartSLOduration=127.192976323 podStartE2EDuration="2m7.192976323s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:11.192802818 +0000 UTC m=+151.246787727" watchObservedRunningTime="2025-10-01 13:08:11.192976323 +0000 UTC m=+151.246961212" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.201596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.206821 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.706807871 +0000 UTC m=+151.760792760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.275518 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gb6ks" podStartSLOduration=127.275500308 podStartE2EDuration="2m7.275500308s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:11.275020674 +0000 UTC m=+151.329005573" watchObservedRunningTime="2025-10-01 13:08:11.275500308 +0000 UTC m=+151.329485217" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.276988 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nrjl4" podStartSLOduration=7.27698236 podStartE2EDuration="7.27698236s" podCreationTimestamp="2025-10-01 13:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:11.251561039 +0000 UTC m=+151.305545938" watchObservedRunningTime="2025-10-01 13:08:11.27698236 +0000 UTC m=+151.330967259" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.302468 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nxdhr" podStartSLOduration=126.302434143 podStartE2EDuration="2m6.302434143s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:11.300769785 +0000 UTC m=+151.354754684" watchObservedRunningTime="2025-10-01 13:08:11.302434143 +0000 UTC m=+151.356419042" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.303892 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.304292 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.804277656 +0000 UTC m=+151.858262555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.358234 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" podStartSLOduration=127.358200467 podStartE2EDuration="2m7.358200467s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:11.354712547 +0000 UTC m=+151.408697446" watchObservedRunningTime="2025-10-01 13:08:11.358200467 +0000 UTC m=+151.412185366" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.404996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.405281 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:11.905270982 +0000 UTC m=+151.959255881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.445506 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" podStartSLOduration=126.445488089 podStartE2EDuration="2m6.445488089s" podCreationTimestamp="2025-10-01 13:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:11.421407486 +0000 UTC m=+151.475392385" watchObservedRunningTime="2025-10-01 13:08:11.445488089 +0000 UTC m=+151.499472988" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.506403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.506629 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.006604837 +0000 UTC m=+152.060589736 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.506767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.507061 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.00705528 +0000 UTC m=+152.061040169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.607896 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.608073 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.108043066 +0000 UTC m=+152.162027965 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.608269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.608608 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.108594402 +0000 UTC m=+152.162579301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.709741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.709942 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.209916487 +0000 UTC m=+152.263901386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.710038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.710330 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.210318129 +0000 UTC m=+152.264303028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.810928 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.811111 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.311084548 +0000 UTC m=+152.365069447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.811536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.811797 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.311789148 +0000 UTC m=+152.365774047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.912495 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.912704 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.412679591 +0000 UTC m=+152.466664490 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.912820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:11 crc kubenswrapper[4749]: E1001 13:08:11.913088 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.413076552 +0000 UTC m=+152.467061451 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.980318 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.980365 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.981762 4749 patch_prober.go:28] interesting pod/apiserver-76f77b778f-x9q82 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.981820 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" podUID="1d30b63f-8173-4be0-a523-f69f604cb48a" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.11:8443/livez\": dial tcp 10.217.0.11:8443: connect: connection refused" Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.990682 4749 patch_prober.go:28] interesting pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:11 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:11 crc kubenswrapper[4749]: 
[+]process-running ok Oct 01 13:08:11 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:11 crc kubenswrapper[4749]: I1001 13:08:11.990737 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.014380 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.014598 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.514566242 +0000 UTC m=+152.568551141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.014895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.015278 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.515267463 +0000 UTC m=+152.569252362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.031009 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.031060 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.115958 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.116179 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.616150855 +0000 UTC m=+152.670135754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.116490 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.116777 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.616769903 +0000 UTC m=+152.670754802 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.147035 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" event={"ID":"2c421ec0-cb43-4116-87a5-9621322f8a33","Type":"ContainerStarted","Data":"fc35070b1e6da8b005cd1f3eddb7d0035996dc6a858941e1394708a6e8fc6cab"} Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.150812 4749 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n4kx2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.150856 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" podUID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.155604 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.217789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.219063 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.219181 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.219434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.219541 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 
13:08:12.224828 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.724810992 +0000 UTC m=+152.778795891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.229421 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.231349 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.237831 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.245441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.320579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.320923 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.820908967 +0000 UTC m=+152.874893866 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.361477 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.390465 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.402401 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.421664 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.421989 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:12.921970724 +0000 UTC m=+152.975955623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.523115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.523491 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.023478745 +0000 UTC m=+153.077463644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.623712 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.624427 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.124412189 +0000 UTC m=+153.178397088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.726903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.727210 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.227198457 +0000 UTC m=+153.281183356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.766622 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4g84g" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.832797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.833534 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.333518136 +0000 UTC m=+153.387503035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.855835 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sbgjp"] Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.858807 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.862357 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sbgjp"] Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.864601 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.940703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-utilities\") pod \"community-operators-sbgjp\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.940740 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szhzb\" (UniqueName: \"kubernetes.io/projected/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-kube-api-access-szhzb\") pod \"community-operators-sbgjp\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " 
pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.940799 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.940874 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-catalog-content\") pod \"community-operators-sbgjp\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:12 crc kubenswrapper[4749]: E1001 13:08:12.941259 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.441245595 +0000 UTC m=+153.495230484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.942776 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:12 crc kubenswrapper[4749]: I1001 13:08:12.958651 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2hdb5" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.006008 4749 patch_prober.go:28] interesting pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:13 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:13 crc kubenswrapper[4749]: [+]process-running ok Oct 01 13:08:13 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.006078 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.011939 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l7q4w"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.013503 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.031324 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.041715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.042031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-catalog-content\") pod \"community-operators-sbgjp\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.042174 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-utilities\") pod \"community-operators-sbgjp\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.042288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szhzb\" (UniqueName: \"kubernetes.io/projected/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-kube-api-access-szhzb\") pod \"community-operators-sbgjp\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:13 crc kubenswrapper[4749]: E1001 13:08:13.042771 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.542754476 +0000 UTC m=+153.596739375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.043283 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-catalog-content\") pod \"community-operators-sbgjp\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.043709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-utilities\") pod \"community-operators-sbgjp\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.048114 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7q4w"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.100045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szhzb\" (UniqueName: \"kubernetes.io/projected/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-kube-api-access-szhzb\") pod \"community-operators-sbgjp\" (UID: 
\"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.145181 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-catalog-content\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.145257 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/4e36eb4c-e668-403b-93d8-1940029337fc-kube-api-access-4qcsj\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.145333 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-utilities\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.145378 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:13 crc kubenswrapper[4749]: E1001 13:08:13.145725 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.645711868 +0000 UTC m=+153.699696767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.203007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" event={"ID":"2c421ec0-cb43-4116-87a5-9621322f8a33","Type":"ContainerStarted","Data":"10ad7f1ee43bcf98b6bfd29473f1a66a42e55f4ff1464ffd9c97581f3259c8fa"} Oct 01 13:08:13 crc kubenswrapper[4749]: W1001 13:08:13.212569 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-7d8828ece1441dcb2bf782cd1fb132ab20cf2cc49d7c8fb71fb8f08cc4555980 WatchSource:0}: Error finding container 7d8828ece1441dcb2bf782cd1fb132ab20cf2cc49d7c8fb71fb8f08cc4555980: Status 404 returned error can't find the container with id 7d8828ece1441dcb2bf782cd1fb132ab20cf2cc49d7c8fb71fb8f08cc4555980 Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.224751 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t78r4" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.229520 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.244743 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kw79f"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.245687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.246338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.246620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/4e36eb4c-e668-403b-93d8-1940029337fc-kube-api-access-4qcsj\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.246938 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-utilities\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.247037 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-catalog-content\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " 
pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.247476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-catalog-content\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: E1001 13:08:13.247621 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.74760518 +0000 UTC m=+153.801590079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.248129 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-utilities\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.262408 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kw79f"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.295454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcsj\" 
(UniqueName: \"kubernetes.io/projected/4e36eb4c-e668-403b-93d8-1940029337fc-kube-api-access-4qcsj\") pod \"certified-operators-l7q4w\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.313211 4749 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.352998 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-catalog-content\") pod \"community-operators-kw79f\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.353351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-utilities\") pod \"community-operators-kw79f\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.353654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.353846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wndc9\" (UniqueName: 
\"kubernetes.io/projected/1ad908df-119f-4496-a55d-64eb43918142-kube-api-access-wndc9\") pod \"community-operators-kw79f\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: E1001 13:08:13.355014 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.85499951 +0000 UTC m=+153.908984399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.361014 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.413584 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xp57r"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.414677 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: W1001 13:08:13.414813 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-3aa7a1382cd069c02e315627fbe931de10ecc0e70454349ac97ca441d93779ec WatchSource:0}: Error finding container 3aa7a1382cd069c02e315627fbe931de10ecc0e70454349ac97ca441d93779ec: Status 404 returned error can't find the container with id 3aa7a1382cd069c02e315627fbe931de10ecc0e70454349ac97ca441d93779ec Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.436975 4749 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T13:08:13.313245579Z","Handler":null,"Name":""} Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.455823 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:13 crc kubenswrapper[4749]: E1001 13:08:13.456629 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.956599313 +0000 UTC m=+154.010584212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.458072 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-catalog-content\") pod \"community-operators-kw79f\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.458112 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-utilities\") pod \"community-operators-kw79f\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.458167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.458205 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wndc9\" (UniqueName: \"kubernetes.io/projected/1ad908df-119f-4496-a55d-64eb43918142-kube-api-access-wndc9\") pod \"community-operators-kw79f\" (UID: 
\"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.457575 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xp57r"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.459132 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-catalog-content\") pod \"community-operators-kw79f\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: E1001 13:08:13.459502 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 13:08:13.959486236 +0000 UTC m=+154.013471135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqndp" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.459756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-utilities\") pod \"community-operators-kw79f\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.491369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wndc9\" (UniqueName: \"kubernetes.io/projected/1ad908df-119f-4496-a55d-64eb43918142-kube-api-access-wndc9\") pod \"community-operators-kw79f\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.513615 4749 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.513655 4749 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.566691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.567023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-catalog-content\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.567095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-utilities\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.567142 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxl72\" (UniqueName: \"kubernetes.io/projected/40e79171-b4ab-495a-af2b-44ea7f6c91ef-kube-api-access-wxl72\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.587300 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.633774 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.667763 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-catalog-content\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.667845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-utilities\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.667876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.667902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxl72\" (UniqueName: \"kubernetes.io/projected/40e79171-b4ab-495a-af2b-44ea7f6c91ef-kube-api-access-wxl72\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.668588 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-catalog-content\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.668787 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-utilities\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.721277 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.721427 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.756017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxl72\" (UniqueName: \"kubernetes.io/projected/40e79171-b4ab-495a-af2b-44ea7f6c91ef-kube-api-access-wxl72\") pod \"certified-operators-xp57r\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.810807 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.811660 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sbgjp"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.888563 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.889163 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.897838 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.897991 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.916329 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.986686 4749 patch_prober.go:28] interesting pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:13 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:13 crc kubenswrapper[4749]: [+]process-running ok Oct 01 13:08:13 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.986741 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.987889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:13 crc kubenswrapper[4749]: I1001 13:08:13.987957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.089125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.089526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.089590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.122048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.128355 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7q4w"] Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.187975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqndp\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.231475 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.260008 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.292970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f2983e2637633aa2a945113626518af1adc503d316b0a72f0489534a4c8c6356"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.293015 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4c88775ca620d3c85c2c9612d6a269b1eea003d4451336c293fc88e8946e8287"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.293158 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.300717 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kw79f"] Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.314044 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7q4w" event={"ID":"4e36eb4c-e668-403b-93d8-1940029337fc","Type":"ContainerStarted","Data":"f498491b6c1aa9befad3987d37796aa07aafa378ff1de5dd6dabff8782bccab1"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.329839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgjp" event={"ID":"73c319a1-ef16-4c75-b18b-0fed4fba7fb7","Type":"ContainerStarted","Data":"2088cbd4f3e5eb5bbe03bfbcd887a867a7616a94c41f160ce7e9ee422ed52ceb"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.329906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgjp" 
event={"ID":"73c319a1-ef16-4c75-b18b-0fed4fba7fb7","Type":"ContainerStarted","Data":"bd2ccbab8584bd1a7fc05d16bbcef4cbdef9d0852aa32850a6af5bb508a6924f"} Oct 01 13:08:14 crc kubenswrapper[4749]: W1001 13:08:14.343866 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad908df_119f_4496_a55d_64eb43918142.slice/crio-c88ee95e01794be93de1e9468a8bb0abef8dd9a780dc9694dcf84d5c62139bc9 WatchSource:0}: Error finding container c88ee95e01794be93de1e9468a8bb0abef8dd9a780dc9694dcf84d5c62139bc9: Status 404 returned error can't find the container with id c88ee95e01794be93de1e9468a8bb0abef8dd9a780dc9694dcf84d5c62139bc9 Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.343943 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.345911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" event={"ID":"2c421ec0-cb43-4116-87a5-9621322f8a33","Type":"ContainerStarted","Data":"d2b056accc7f1c9ca9d163f43fe0594e1d33dd5dedf781c8a91f62530ddef8a0"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.357604 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"74d5cac12b3f76d40ac6c0671d1611cf25820023e1d28a37056ce76ed3e116ff"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.357642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3aa7a1382cd069c02e315627fbe931de10ecc0e70454349ac97ca441d93779ec"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.363052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"95cdf9f959ec59b1f4187ca52a8171f568b3f55d25d8af0caac14906f7b6afca"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.363095 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7d8828ece1441dcb2bf782cd1fb132ab20cf2cc49d7c8fb71fb8f08cc4555980"} Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.382174 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xp57r"] Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.756007 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4kcnk" podStartSLOduration=10.755991849 podStartE2EDuration="10.755991849s" podCreationTimestamp="2025-10-01 13:08:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:14.488462811 +0000 UTC m=+154.542447720" watchObservedRunningTime="2025-10-01 13:08:14.755991849 +0000 UTC m=+154.809976748" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.761291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.861407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqndp"] Oct 01 13:08:14 crc kubenswrapper[4749]: W1001 13:08:14.875101 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ba15709_180a_4045_9d19_df6de2d8cf6e.slice/crio-fc4bcd2ed4a514970d710f3e4a639cdab2a9bf06b7bda3e9ad2adbcf5589e5b4 
WatchSource:0}: Error finding container fc4bcd2ed4a514970d710f3e4a639cdab2a9bf06b7bda3e9ad2adbcf5589e5b4: Status 404 returned error can't find the container with id fc4bcd2ed4a514970d710f3e4a639cdab2a9bf06b7bda3e9ad2adbcf5589e5b4 Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.986907 4749 patch_prober.go:28] interesting pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:14 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:14 crc kubenswrapper[4749]: [+]process-running ok Oct 01 13:08:14 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.986963 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.993956 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzxsg"] Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.994880 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:14 crc kubenswrapper[4749]: I1001 13:08:14.998287 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.009763 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzxsg"] Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.120371 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv6kd\" (UniqueName: \"kubernetes.io/projected/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-kube-api-access-mv6kd\") pod \"redhat-marketplace-gzxsg\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.120437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-utilities\") pod \"redhat-marketplace-gzxsg\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.120487 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-catalog-content\") pod \"redhat-marketplace-gzxsg\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.221309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv6kd\" (UniqueName: \"kubernetes.io/projected/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-kube-api-access-mv6kd\") pod \"redhat-marketplace-gzxsg\" (UID: 
\"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.221750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-utilities\") pod \"redhat-marketplace-gzxsg\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.221780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-catalog-content\") pod \"redhat-marketplace-gzxsg\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.222656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-catalog-content\") pod \"redhat-marketplace-gzxsg\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.223311 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-utilities\") pod \"redhat-marketplace-gzxsg\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.245000 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.246940 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mv6kd\" (UniqueName: \"kubernetes.io/projected/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-kube-api-access-mv6kd\") pod \"redhat-marketplace-gzxsg\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.311935 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.375150 4749 generic.go:334] "Generic (PLEG): container finished" podID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerID="c406e2b954937a18038f8c54a4fa5b8295df59cb695bb529b5dfdcdc21f24939" exitCode=0 Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.375260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp57r" event={"ID":"40e79171-b4ab-495a-af2b-44ea7f6c91ef","Type":"ContainerDied","Data":"c406e2b954937a18038f8c54a4fa5b8295df59cb695bb529b5dfdcdc21f24939"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.375311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp57r" event={"ID":"40e79171-b4ab-495a-af2b-44ea7f6c91ef","Type":"ContainerStarted","Data":"d2c6042186b9ce44eb34eda60432437ad00d9cce4c3bb3e97c67c25aeae49946"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.383243 4749 generic.go:334] "Generic (PLEG): container finished" podID="1ad908df-119f-4496-a55d-64eb43918142" containerID="f0b5ddaf52dfbe67fd329780e8c18e93a003e358af9c53462eb1352762f108e4" exitCode=0 Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.383346 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw79f" event={"ID":"1ad908df-119f-4496-a55d-64eb43918142","Type":"ContainerDied","Data":"f0b5ddaf52dfbe67fd329780e8c18e93a003e358af9c53462eb1352762f108e4"} Oct 01 13:08:15 crc 
kubenswrapper[4749]: I1001 13:08:15.383438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw79f" event={"ID":"1ad908df-119f-4496-a55d-64eb43918142","Type":"ContainerStarted","Data":"c88ee95e01794be93de1e9468a8bb0abef8dd9a780dc9694dcf84d5c62139bc9"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.390990 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2vddf"] Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.391998 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e36eb4c-e668-403b-93d8-1940029337fc" containerID="7fa4e070b331a1c1f66e1919ccc5e061d638cc1f04e61aa88ffe9fdda3424da5" exitCode=0 Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.392048 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.392064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7q4w" event={"ID":"4e36eb4c-e668-403b-93d8-1940029337fc","Type":"ContainerDied","Data":"7fa4e070b331a1c1f66e1919ccc5e061d638cc1f04e61aa88ffe9fdda3424da5"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.400668 4749 generic.go:334] "Generic (PLEG): container finished" podID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerID="2088cbd4f3e5eb5bbe03bfbcd887a867a7616a94c41f160ce7e9ee422ed52ceb" exitCode=0 Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.400751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgjp" event={"ID":"73c319a1-ef16-4c75-b18b-0fed4fba7fb7","Type":"ContainerDied","Data":"2088cbd4f3e5eb5bbe03bfbcd887a867a7616a94c41f160ce7e9ee422ed52ceb"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.404201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" 
event={"ID":"8ba15709-180a-4045-9d19-df6de2d8cf6e","Type":"ContainerStarted","Data":"11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.404293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" event={"ID":"8ba15709-180a-4045-9d19-df6de2d8cf6e","Type":"ContainerStarted","Data":"fc4bcd2ed4a514970d710f3e4a639cdab2a9bf06b7bda3e9ad2adbcf5589e5b4"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.404420 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.405412 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vddf"] Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.406867 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1f88ca20-b404-4d0d-8b3d-dd9171928e81","Type":"ContainerStarted","Data":"7f926801295bb71c44070aeea5c478d957bcd898f0775758b4e811bef9de72c7"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.406895 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1f88ca20-b404-4d0d-8b3d-dd9171928e81","Type":"ContainerStarted","Data":"119348ea2602b23138f692bf1d913b4370ec82fa11c345ab892b2c840566be4d"} Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.481012 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.480995259 podStartE2EDuration="2.480995259s" podCreationTimestamp="2025-10-01 13:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:15.480403512 +0000 UTC 
m=+155.534388411" watchObservedRunningTime="2025-10-01 13:08:15.480995259 +0000 UTC m=+155.534980158" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.498700 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" podStartSLOduration=131.498684238 podStartE2EDuration="2m11.498684238s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:15.498348758 +0000 UTC m=+155.552333677" watchObservedRunningTime="2025-10-01 13:08:15.498684238 +0000 UTC m=+155.552669137" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.533881 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-catalog-content\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.533939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbm2g\" (UniqueName: \"kubernetes.io/projected/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-kube-api-access-dbm2g\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.534162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-utilities\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.635279 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-catalog-content\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.635568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbm2g\" (UniqueName: \"kubernetes.io/projected/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-kube-api-access-dbm2g\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.635650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-utilities\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.636000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-catalog-content\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.636053 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-utilities\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.652525 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dbm2g\" (UniqueName: \"kubernetes.io/projected/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-kube-api-access-dbm2g\") pod \"redhat-marketplace-2vddf\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.714273 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.805526 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzxsg"] Oct 01 13:08:15 crc kubenswrapper[4749]: W1001 13:08:15.828373 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c9a345f_5d20_4673_a6bb_e70e387e5d8f.slice/crio-6785d623bb028c5b82925049dcacbe2be55ba131de482a8ab3dff238c1ccf5c2 WatchSource:0}: Error finding container 6785d623bb028c5b82925049dcacbe2be55ba131de482a8ab3dff238c1ccf5c2: Status 404 returned error can't find the container with id 6785d623bb028c5b82925049dcacbe2be55ba131de482a8ab3dff238c1ccf5c2 Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.923271 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vddf"] Oct 01 13:08:15 crc kubenswrapper[4749]: W1001 13:08:15.926156 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2016f7a2_9df5_474b_92e4_1bf3cd8175e3.slice/crio-0d861d7347bf860392d94ceb9a403781af01d3862ee769e41592ed8f4b64f875 WatchSource:0}: Error finding container 0d861d7347bf860392d94ceb9a403781af01d3862ee769e41592ed8f4b64f875: Status 404 returned error can't find the container with id 0d861d7347bf860392d94ceb9a403781af01d3862ee769e41592ed8f4b64f875 Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.987288 4749 patch_prober.go:28] interesting 
pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:15 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:15 crc kubenswrapper[4749]: [+]process-running ok Oct 01 13:08:15 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.987354 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.995895 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2nbq9"] Oct 01 13:08:15 crc kubenswrapper[4749]: I1001 13:08:15.996808 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.012021 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.030525 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nbq9"] Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.147950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-catalog-content\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.148064 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h6jl\" (UniqueName: \"kubernetes.io/projected/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-kube-api-access-7h6jl\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.148101 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-utilities\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.157369 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.158003 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.161376 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.162107 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.176632 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.250893 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-utilities\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.250942 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-catalog-content\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.250965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.251036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h6jl\" (UniqueName: 
\"kubernetes.io/projected/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-kube-api-access-7h6jl\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.251068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.251689 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-utilities\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.252572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-catalog-content\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.275960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h6jl\" (UniqueName: \"kubernetes.io/projected/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-kube-api-access-7h6jl\") pod \"redhat-operators-2nbq9\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.299634 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-2j6m9 container/download-server namespace/openshift-console: Readiness probe status=failure 
output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.299684 4749 patch_prober.go:28] interesting pod/downloads-7954f5f757-2j6m9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" start-of-body= Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.299736 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2j6m9" podUID="43aae4a1-9504-4c81-9ff1-675c0b51ced2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.299691 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2j6m9" podUID="43aae4a1-9504-4c81-9ff1-675c0b51ced2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.9:8080/\": dial tcp 10.217.0.9:8080: connect: connection refused" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.333937 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.353317 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.353430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.353587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.374716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.398702 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dkxgb"] Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.400141 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.405621 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dkxgb"] Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.415109 4749 generic.go:334] "Generic (PLEG): container finished" podID="d5321a24-d271-46cb-9d0a-fde8089a6ddc" containerID="d09e4abe7c9b199e38017b52d57c783aac5ad187ac71bb789a2d4d0f1d648826" exitCode=0 Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.415183 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" event={"ID":"d5321a24-d271-46cb-9d0a-fde8089a6ddc","Type":"ContainerDied","Data":"d09e4abe7c9b199e38017b52d57c783aac5ad187ac71bb789a2d4d0f1d648826"} Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.416503 4749 generic.go:334] "Generic (PLEG): container finished" podID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerID="a1f22af2c4c943a99ad3349ef4ae3aca19ebc0bd25b3de20eadced023cc4ef70" exitCode=0 Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.416550 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vddf" event={"ID":"2016f7a2-9df5-474b-92e4-1bf3cd8175e3","Type":"ContainerDied","Data":"a1f22af2c4c943a99ad3349ef4ae3aca19ebc0bd25b3de20eadced023cc4ef70"} Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.416566 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vddf" event={"ID":"2016f7a2-9df5-474b-92e4-1bf3cd8175e3","Type":"ContainerStarted","Data":"0d861d7347bf860392d94ceb9a403781af01d3862ee769e41592ed8f4b64f875"} Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.423543 4749 generic.go:334] "Generic (PLEG): container finished" podID="1f88ca20-b404-4d0d-8b3d-dd9171928e81" containerID="7f926801295bb71c44070aeea5c478d957bcd898f0775758b4e811bef9de72c7" exitCode=0 
Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.423687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1f88ca20-b404-4d0d-8b3d-dd9171928e81","Type":"ContainerDied","Data":"7f926801295bb71c44070aeea5c478d957bcd898f0775758b4e811bef9de72c7"} Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.429701 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerID="8d7997bb1dbeec7de01b085a65e970053505dc49496a5b0dcd6cd15ee2540649" exitCode=0 Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.429795 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzxsg" event={"ID":"6c9a345f-5d20-4673-a6bb-e70e387e5d8f","Type":"ContainerDied","Data":"8d7997bb1dbeec7de01b085a65e970053505dc49496a5b0dcd6cd15ee2540649"} Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.429834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzxsg" event={"ID":"6c9a345f-5d20-4673-a6bb-e70e387e5d8f","Type":"ContainerStarted","Data":"6785d623bb028c5b82925049dcacbe2be55ba131de482a8ab3dff238c1ccf5c2"} Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.518083 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.559916 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-catalog-content\") pod \"redhat-operators-dkxgb\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.560259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbl4r\" (UniqueName: \"kubernetes.io/projected/186102be-9b7e-4807-a141-6c85d69155df-kube-api-access-nbl4r\") pod \"redhat-operators-dkxgb\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.560351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-utilities\") pod \"redhat-operators-dkxgb\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.661660 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-utilities\") pod \"redhat-operators-dkxgb\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.661783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-catalog-content\") pod \"redhat-operators-dkxgb\" (UID: 
\"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.661814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbl4r\" (UniqueName: \"kubernetes.io/projected/186102be-9b7e-4807-a141-6c85d69155df-kube-api-access-nbl4r\") pod \"redhat-operators-dkxgb\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.662692 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-catalog-content\") pod \"redhat-operators-dkxgb\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.662821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-utilities\") pod \"redhat-operators-dkxgb\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.681022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbl4r\" (UniqueName: \"kubernetes.io/projected/186102be-9b7e-4807-a141-6c85d69155df-kube-api-access-nbl4r\") pod \"redhat-operators-dkxgb\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.749256 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.796249 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2nbq9"] Oct 01 13:08:16 crc kubenswrapper[4749]: W1001 13:08:16.847394 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f96d1ba_50b5_4f78_83c6_a97f3617ac75.slice/crio-8372e4c25af46ace2499dfa54f632e5c4926e8da4b16ac697b5a101529cf5c11 WatchSource:0}: Error finding container 8372e4c25af46ace2499dfa54f632e5c4926e8da4b16ac697b5a101529cf5c11: Status 404 returned error can't find the container with id 8372e4c25af46ace2499dfa54f632e5c4926e8da4b16ac697b5a101529cf5c11 Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.877838 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.982879 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.986648 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.990755 4749 patch_prober.go:28] interesting pod/router-default-5444994796-d2qqz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 13:08:16 crc kubenswrapper[4749]: [-]has-synced failed: reason withheld Oct 01 13:08:16 crc kubenswrapper[4749]: [+]process-running ok Oct 01 13:08:16 crc kubenswrapper[4749]: healthz check failed Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.990821 4749 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-d2qqz" podUID="97489cec-2ad5-4b5c-89fd-d51ab641c126" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 13:08:16 crc kubenswrapper[4749]: I1001 13:08:16.994622 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-x9q82" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.076851 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.128609 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.128918 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.142521 4749 patch_prober.go:28] interesting pod/console-f9d7485db-nsv4j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.142559 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nsv4j" podUID="bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.37:8443/health\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.195682 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.222173 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-dkxgb"] Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.444315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkxgb" event={"ID":"186102be-9b7e-4807-a141-6c85d69155df","Type":"ContainerStarted","Data":"38a649e037ef4cd1a2b760477e4a9397c769eabd63dfcf63aceea01490246820"} Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.446453 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerID="79e2f2a7f272e1b309f4e200559a0173b715512f3d5a5eb9cee22c9861102741" exitCode=0 Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.446498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nbq9" event={"ID":"6f96d1ba-50b5-4f78-83c6-a97f3617ac75","Type":"ContainerDied","Data":"79e2f2a7f272e1b309f4e200559a0173b715512f3d5a5eb9cee22c9861102741"} Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.446516 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nbq9" event={"ID":"6f96d1ba-50b5-4f78-83c6-a97f3617ac75","Type":"ContainerStarted","Data":"8372e4c25af46ace2499dfa54f632e5c4926e8da4b16ac697b5a101529cf5c11"} Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.447956 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2","Type":"ContainerStarted","Data":"1d5d7b10f7eba6b00e568ce908cb37c092d5cd97c8e86d6db8504aee0c881449"} Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.768808 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.774626 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.807069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5321a24-d271-46cb-9d0a-fde8089a6ddc-config-volume\") pod \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.807144 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lghp4\" (UniqueName: \"kubernetes.io/projected/d5321a24-d271-46cb-9d0a-fde8089a6ddc-kube-api-access-lghp4\") pod \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.808189 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5321a24-d271-46cb-9d0a-fde8089a6ddc-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5321a24-d271-46cb-9d0a-fde8089a6ddc" (UID: "d5321a24-d271-46cb-9d0a-fde8089a6ddc"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.808289 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5321a24-d271-46cb-9d0a-fde8089a6ddc-secret-volume\") pod \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\" (UID: \"d5321a24-d271-46cb-9d0a-fde8089a6ddc\") " Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.809077 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kube-api-access\") pod \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\" (UID: \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\") " Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.809165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kubelet-dir\") pod \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\" (UID: \"1f88ca20-b404-4d0d-8b3d-dd9171928e81\") " Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.809463 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5321a24-d271-46cb-9d0a-fde8089a6ddc-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.809519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1f88ca20-b404-4d0d-8b3d-dd9171928e81" (UID: "1f88ca20-b404-4d0d-8b3d-dd9171928e81"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.820633 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5321a24-d271-46cb-9d0a-fde8089a6ddc-kube-api-access-lghp4" (OuterVolumeSpecName: "kube-api-access-lghp4") pod "d5321a24-d271-46cb-9d0a-fde8089a6ddc" (UID: "d5321a24-d271-46cb-9d0a-fde8089a6ddc"). InnerVolumeSpecName "kube-api-access-lghp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.821065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5321a24-d271-46cb-9d0a-fde8089a6ddc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5321a24-d271-46cb-9d0a-fde8089a6ddc" (UID: "d5321a24-d271-46cb-9d0a-fde8089a6ddc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.821812 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1f88ca20-b404-4d0d-8b3d-dd9171928e81" (UID: "1f88ca20-b404-4d0d-8b3d-dd9171928e81"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.911160 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5321a24-d271-46cb-9d0a-fde8089a6ddc-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.911207 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.911247 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f88ca20-b404-4d0d-8b3d-dd9171928e81-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.911261 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lghp4\" (UniqueName: \"kubernetes.io/projected/d5321a24-d271-46cb-9d0a-fde8089a6ddc-kube-api-access-lghp4\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.993021 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:17 crc kubenswrapper[4749]: I1001 13:08:17.995658 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-d2qqz" Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.467456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2","Type":"ContainerStarted","Data":"bf01cff0f61780c06d148ad62a4d22a4000e7cab1cb8fe87aa31324a840f902a"} Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.469986 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" event={"ID":"d5321a24-d271-46cb-9d0a-fde8089a6ddc","Type":"ContainerDied","Data":"f28b89cca67bafeadbc385d6ed27b43692efc06ffdb95749c34a40e139cc00b7"} Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.470014 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f28b89cca67bafeadbc385d6ed27b43692efc06ffdb95749c34a40e139cc00b7" Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.470076 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx" Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.472486 4749 generic.go:334] "Generic (PLEG): container finished" podID="186102be-9b7e-4807-a141-6c85d69155df" containerID="5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2" exitCode=0 Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.472537 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkxgb" event={"ID":"186102be-9b7e-4807-a141-6c85d69155df","Type":"ContainerDied","Data":"5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2"} Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.475975 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.477475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1f88ca20-b404-4d0d-8b3d-dd9171928e81","Type":"ContainerDied","Data":"119348ea2602b23138f692bf1d913b4370ec82fa11c345ab892b2c840566be4d"} Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.477528 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119348ea2602b23138f692bf1d913b4370ec82fa11c345ab892b2c840566be4d" Oct 01 13:08:18 crc kubenswrapper[4749]: I1001 13:08:18.503157 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.5031398620000003 podStartE2EDuration="2.503139862s" podCreationTimestamp="2025-10-01 13:08:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:18.485184596 +0000 UTC m=+158.539169495" watchObservedRunningTime="2025-10-01 13:08:18.503139862 +0000 UTC m=+158.557124761" Oct 01 13:08:19 crc kubenswrapper[4749]: I1001 13:08:19.252173 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nrjl4" Oct 01 13:08:19 crc kubenswrapper[4749]: I1001 13:08:19.498158 4749 generic.go:334] "Generic (PLEG): container finished" podID="3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2" containerID="bf01cff0f61780c06d148ad62a4d22a4000e7cab1cb8fe87aa31324a840f902a" exitCode=0 Oct 01 13:08:19 crc kubenswrapper[4749]: I1001 13:08:19.498204 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2","Type":"ContainerDied","Data":"bf01cff0f61780c06d148ad62a4d22a4000e7cab1cb8fe87aa31324a840f902a"} Oct 01 13:08:26 crc 
kubenswrapper[4749]: I1001 13:08:26.325733 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2j6m9" Oct 01 13:08:26 crc kubenswrapper[4749]: I1001 13:08:26.816073 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:26 crc kubenswrapper[4749]: I1001 13:08:26.988078 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kubelet-dir\") pod \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\" (UID: \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\") " Oct 01 13:08:26 crc kubenswrapper[4749]: I1001 13:08:26.988403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2" (UID: "3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:08:26 crc kubenswrapper[4749]: I1001 13:08:26.988593 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kube-api-access\") pod \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\" (UID: \"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2\") " Oct 01 13:08:26 crc kubenswrapper[4749]: I1001 13:08:26.989771 4749 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:26 crc kubenswrapper[4749]: I1001 13:08:26.994262 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2" (UID: "3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.090705 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.187154 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.193329 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.564851 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.564887 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2","Type":"ContainerDied","Data":"1d5d7b10f7eba6b00e568ce908cb37c092d5cd97c8e86d6db8504aee0c881449"} Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.564910 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d5d7b10f7eba6b00e568ce908cb37c092d5cd97c8e86d6db8504aee0c881449" Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.698937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.704751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/27497171-a8cc-4282-8ee6-2f68f768fc69-metrics-certs\") pod \"network-metrics-daemon-mwlpq\" (UID: \"27497171-a8cc-4282-8ee6-2f68f768fc69\") " pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:08:27 crc kubenswrapper[4749]: I1001 13:08:27.975239 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mwlpq" Oct 01 13:08:32 crc kubenswrapper[4749]: I1001 13:08:32.107279 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:08:32 crc kubenswrapper[4749]: I1001 13:08:32.107697 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:08:34 crc kubenswrapper[4749]: I1001 13:08:34.272873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:08:46 crc kubenswrapper[4749]: I1001 13:08:46.717700 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tlpbw" Oct 01 13:08:50 crc kubenswrapper[4749]: E1001 13:08:50.609821 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 13:08:50 crc kubenswrapper[4749]: E1001 13:08:50.610407 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4qcsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-l7q4w_openshift-marketplace(4e36eb4c-e668-403b-93d8-1940029337fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:08:50 crc kubenswrapper[4749]: E1001 13:08:50.611868 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-l7q4w" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" Oct 01 13:08:51 crc 
kubenswrapper[4749]: E1001 13:08:51.285948 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-l7q4w" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" Oct 01 13:08:51 crc kubenswrapper[4749]: E1001 13:08:51.355855 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 13:08:51 crc kubenswrapper[4749]: E1001 13:08:51.356033 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dbm2g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2vddf_openshift-marketplace(2016f7a2-9df5-474b-92e4-1bf3cd8175e3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:08:51 crc kubenswrapper[4749]: E1001 13:08:51.357212 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2vddf" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" Oct 01 13:08:52 crc 
kubenswrapper[4749]: I1001 13:08:52.366442 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.499086 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2vddf" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.556878 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.557343 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wndc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kw79f_openshift-marketplace(1ad908df-119f-4496-a55d-64eb43918142): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.558636 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kw79f" podUID="1ad908df-119f-4496-a55d-64eb43918142"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.577158 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.577271 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wxl72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xp57r_openshift-marketplace(40e79171-b4ab-495a-af2b-44ea7f6c91ef): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.578501 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xp57r" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.595638 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.595727 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mv6kd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-gzxsg_openshift-marketplace(6c9a345f-5d20-4673-a6bb-e70e387e5d8f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.597841 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-gzxsg" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.605375 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.605475 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-szhzb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-sbgjp_openshift-marketplace(73c319a1-ef16-4c75-b18b-0fed4fba7fb7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 01 13:08:52 crc kubenswrapper[4749]: E1001 13:08:52.606714 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-sbgjp" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.530643 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-gzxsg" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.530721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xp57r" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.530825 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-sbgjp" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.530871 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kw79f" podUID="1ad908df-119f-4496-a55d-64eb43918142"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.608965 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.610766 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7h6jl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-2nbq9_openshift-marketplace(6f96d1ba-50b5-4f78-83c6-a97f3617ac75): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.612634 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-2nbq9" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.639820 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.640041 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nbl4r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dkxgb_openshift-marketplace(186102be-9b7e-4807-a141-6c85d69155df): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.641274 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dkxgb" podUID="186102be-9b7e-4807-a141-6c85d69155df"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.743721 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dkxgb" podUID="186102be-9b7e-4807-a141-6c85d69155df"
Oct 01 13:08:55 crc kubenswrapper[4749]: E1001 13:08:55.743748 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-2nbq9" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75"
Oct 01 13:08:55 crc kubenswrapper[4749]: I1001 13:08:55.967535 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mwlpq"]
Oct 01 13:08:55 crc kubenswrapper[4749]: W1001 13:08:55.975638 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27497171_a8cc_4282_8ee6_2f68f768fc69.slice/crio-70f6d97c806572b77f26ea620850856babdf5d34b191f77415e0ffcff0048dfc WatchSource:0}: Error finding container 70f6d97c806572b77f26ea620850856babdf5d34b191f77415e0ffcff0048dfc: Status 404 returned error can't find the container with id 70f6d97c806572b77f26ea620850856babdf5d34b191f77415e0ffcff0048dfc
Oct 01 13:08:56 crc kubenswrapper[4749]: I1001 13:08:56.750813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" event={"ID":"27497171-a8cc-4282-8ee6-2f68f768fc69","Type":"ContainerStarted","Data":"9a7b55331df0aab48afb41ec4e6a61322d77c50d43e2fc1b44fab223e14b651a"}
Oct 01 13:08:56 crc kubenswrapper[4749]: I1001 13:08:56.751374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" event={"ID":"27497171-a8cc-4282-8ee6-2f68f768fc69","Type":"ContainerStarted","Data":"70f6d97c806572b77f26ea620850856babdf5d34b191f77415e0ffcff0048dfc"}
Oct 01 13:08:57 crc kubenswrapper[4749]: I1001 13:08:57.761362 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mwlpq" event={"ID":"27497171-a8cc-4282-8ee6-2f68f768fc69","Type":"ContainerStarted","Data":"8131e25730063d0205a3f017adfff96a3a37b32fc4e15b4c23a5fded87088a8f"}
Oct 01 13:08:57 crc kubenswrapper[4749]: I1001 13:08:57.783697 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mwlpq" podStartSLOduration=173.783677283 podStartE2EDuration="2m53.783677283s" podCreationTimestamp="2025-10-01 13:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:08:57.780102909 +0000 UTC m=+197.834087808" watchObservedRunningTime="2025-10-01 13:08:57.783677283 +0000 UTC m=+197.837662182"
Oct 01 13:09:02 crc kubenswrapper[4749]: I1001 13:09:02.107419 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:09:02 crc kubenswrapper[4749]: I1001 13:09:02.107745 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:09:06 crc kubenswrapper[4749]: I1001 13:09:06.870316 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e36eb4c-e668-403b-93d8-1940029337fc" containerID="f6425c3375cc7d177b2d6900b8624543d5183252394810552037a1796db3bb4c" exitCode=0
Oct 01 13:09:06 crc kubenswrapper[4749]: I1001 13:09:06.870413 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7q4w" event={"ID":"4e36eb4c-e668-403b-93d8-1940029337fc","Type":"ContainerDied","Data":"f6425c3375cc7d177b2d6900b8624543d5183252394810552037a1796db3bb4c"}
Oct 01 13:09:07 crc kubenswrapper[4749]: I1001 13:09:07.879560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7q4w" event={"ID":"4e36eb4c-e668-403b-93d8-1940029337fc","Type":"ContainerStarted","Data":"b6bdf48c0652f72ae25621e8acdb358c370d10e24d94903c2562b7cfa7257a96"}
Oct 01 13:09:07 crc kubenswrapper[4749]: I1001 13:09:07.882882 4749 generic.go:334] "Generic (PLEG): container finished" podID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerID="d62af885e013b5bd5478dbda934e83a694e6db14fc0a4a9817e22276445b7cf6" exitCode=0
Oct 01 13:09:07 crc kubenswrapper[4749]: I1001 13:09:07.882918 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgjp" event={"ID":"73c319a1-ef16-4c75-b18b-0fed4fba7fb7","Type":"ContainerDied","Data":"d62af885e013b5bd5478dbda934e83a694e6db14fc0a4a9817e22276445b7cf6"}
Oct 01 13:09:07 crc kubenswrapper[4749]: I1001 13:09:07.884693 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerID="96e7bae878b180e4fc1fa55288a793cc58b0fd5bea55a40a72c5fd6f1b4472c0" exitCode=0
Oct 01 13:09:07 crc kubenswrapper[4749]: I1001 13:09:07.884722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nbq9" event={"ID":"6f96d1ba-50b5-4f78-83c6-a97f3617ac75","Type":"ContainerDied","Data":"96e7bae878b180e4fc1fa55288a793cc58b0fd5bea55a40a72c5fd6f1b4472c0"}
Oct 01 13:09:07 crc kubenswrapper[4749]: I1001 13:09:07.894547 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerID="a23b3f7f8ba0bd15d73f4dfc6dd13af0e81bf37dbb8309d1900967b6ac436d7b" exitCode=0
Oct 01 13:09:07 crc kubenswrapper[4749]: I1001 13:09:07.894593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzxsg" event={"ID":"6c9a345f-5d20-4673-a6bb-e70e387e5d8f","Type":"ContainerDied","Data":"a23b3f7f8ba0bd15d73f4dfc6dd13af0e81bf37dbb8309d1900967b6ac436d7b"}
Oct 01 13:09:07 crc kubenswrapper[4749]: I1001 13:09:07.912486 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l7q4w" podStartSLOduration=3.861039732 podStartE2EDuration="55.912466735s" podCreationTimestamp="2025-10-01 13:08:12 +0000 UTC" firstStartedPulling="2025-10-01 13:08:15.396013034 +0000 UTC m=+155.449997933" lastFinishedPulling="2025-10-01 13:09:07.447440027 +0000 UTC m=+207.501424936" observedRunningTime="2025-10-01 13:09:07.905893644 +0000 UTC m=+207.959878553" watchObservedRunningTime="2025-10-01 13:09:07.912466735 +0000 UTC m=+207.966451644"
Oct 01 13:09:08 crc kubenswrapper[4749]: I1001 13:09:08.900902 4749 generic.go:334] "Generic (PLEG): container finished" podID="1ad908df-119f-4496-a55d-64eb43918142" containerID="6235f3ca2251572107f5ee52147f587d52c69d49db4ccf7a25921feefec078b5" exitCode=0
Oct 01 13:09:08 crc kubenswrapper[4749]: I1001 13:09:08.900939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw79f" event={"ID":"1ad908df-119f-4496-a55d-64eb43918142","Type":"ContainerDied","Data":"6235f3ca2251572107f5ee52147f587d52c69d49db4ccf7a25921feefec078b5"}
Oct 01 13:09:08 crc kubenswrapper[4749]: I1001 13:09:08.905388 4749 generic.go:334] "Generic (PLEG): container finished" podID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerID="b510306b97c9724af6a85e93c5fd5e6842af07948d1bab663f8c8d5b4137b94b" exitCode=0
Oct 01 13:09:08 crc kubenswrapper[4749]: I1001 13:09:08.905454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vddf" event={"ID":"2016f7a2-9df5-474b-92e4-1bf3cd8175e3","Type":"ContainerDied","Data":"b510306b97c9724af6a85e93c5fd5e6842af07948d1bab663f8c8d5b4137b94b"}
Oct 01 13:09:08 crc kubenswrapper[4749]: I1001 13:09:08.907981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgjp" event={"ID":"73c319a1-ef16-4c75-b18b-0fed4fba7fb7","Type":"ContainerStarted","Data":"e0acadafdd62c886e4f3bf16a58302fd4766c44d11d17cc8c9bf9463ac6d5be7"}
Oct 01 13:09:08 crc kubenswrapper[4749]: I1001 13:09:08.912200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nbq9" event={"ID":"6f96d1ba-50b5-4f78-83c6-a97f3617ac75","Type":"ContainerStarted","Data":"c645d5be49b52d44e6d98ec705ae7dd3961d0beb996925988db9b3fcc3d79160"}
Oct 01 13:09:08 crc kubenswrapper[4749]: I1001 13:09:08.915078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzxsg" event={"ID":"6c9a345f-5d20-4673-a6bb-e70e387e5d8f","Type":"ContainerStarted","Data":"cab6c44f621abdc232ef646a269307808f105a3c435f1880c327e7cf84e4f04c"}
Oct 01 13:09:08 crc kubenswrapper[4749]: I1001 13:09:08.935178 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sbgjp" podStartSLOduration=2.9568737240000003 podStartE2EDuration="56.935165001s" podCreationTimestamp="2025-10-01 13:08:12 +0000 UTC" firstStartedPulling="2025-10-01 13:08:14.343668455 +0000 UTC m=+154.397653364" lastFinishedPulling="2025-10-01 13:09:08.321959742 +0000 UTC m=+208.375944641" observedRunningTime="2025-10-01 13:09:08.934316957 +0000 UTC m=+208.988301866" watchObservedRunningTime="2025-10-01 13:09:08.935165001 +0000 UTC m=+208.989149900"
Oct 01 13:09:09 crc kubenswrapper[4749]: I1001 13:09:09.935984 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzxsg" podStartSLOduration=3.837262893 podStartE2EDuration="55.935968333s" podCreationTimestamp="2025-10-01 13:08:14 +0000 UTC" firstStartedPulling="2025-10-01 13:08:16.4340267 +0000 UTC m=+156.488011599" lastFinishedPulling="2025-10-01 13:09:08.53273213 +0000 UTC m=+208.586717039" observedRunningTime="2025-10-01 13:09:09.934121449 +0000 UTC m=+209.988106348" watchObservedRunningTime="2025-10-01 13:09:09.935968333 +0000 UTC m=+209.989953232"
Oct 01 13:09:09 crc kubenswrapper[4749]: I1001 13:09:09.937459 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2nbq9" podStartSLOduration=3.975097613 podStartE2EDuration="54.937453636s" podCreationTimestamp="2025-10-01 13:08:15 +0000 UTC" firstStartedPulling="2025-10-01 13:08:17.449305272 +0000 UTC m=+157.503290171" lastFinishedPulling="2025-10-01 13:09:08.411661285 +0000 UTC m=+208.465646194" observedRunningTime="2025-10-01 13:09:08.97920221 +0000 UTC m=+209.033187109" watchObservedRunningTime="2025-10-01 13:09:09.937453636 +0000 UTC m=+209.991438535"
Oct 01 13:09:10 crc kubenswrapper[4749]: I1001 13:09:10.926701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw79f" event={"ID":"1ad908df-119f-4496-a55d-64eb43918142","Type":"ContainerStarted","Data":"3778c16cbec149685eea1c4b6d799cd77cc505e5a020b4d9db3222445e9cecdc"}
Oct 01 13:09:10 crc kubenswrapper[4749]: I1001 13:09:10.929590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vddf" event={"ID":"2016f7a2-9df5-474b-92e4-1bf3cd8175e3","Type":"ContainerStarted","Data":"7aec6076ada0425224930e1d15754278ec95192f0b8075c650ac33baf47121c4"}
Oct 01 13:09:10 crc kubenswrapper[4749]: I1001 13:09:10.946725 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kw79f" podStartSLOduration=3.407931909 podStartE2EDuration="57.946712102s" podCreationTimestamp="2025-10-01 13:08:13 +0000 UTC" firstStartedPulling="2025-10-01 13:08:15.389030483 +0000 UTC m=+155.443015392" lastFinishedPulling="2025-10-01 13:09:09.927810686 +0000 UTC m=+209.981795585" observedRunningTime="2025-10-01 13:09:10.944650312 +0000 UTC m=+210.998635211" watchObservedRunningTime="2025-10-01 13:09:10.946712102 +0000 UTC m=+211.000696991"
Oct 01 13:09:10 crc kubenswrapper[4749]: I1001 13:09:10.964230 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2vddf" podStartSLOduration=2.50594521 podStartE2EDuration="55.96419808s" podCreationTimestamp="2025-10-01 13:08:15 +0000 UTC" firstStartedPulling="2025-10-01 13:08:16.417957668 +0000 UTC m=+156.471942567" lastFinishedPulling="2025-10-01 13:09:09.876210538 +0000 UTC m=+209.930195437" observedRunningTime="2025-10-01 13:09:10.961269485 +0000 UTC m=+211.015254384" watchObservedRunningTime="2025-10-01 13:09:10.96419808 +0000 UTC m=+211.018182969"
Oct 01 13:09:11 crc kubenswrapper[4749]: I1001 13:09:11.944858 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkxgb" event={"ID":"186102be-9b7e-4807-a141-6c85d69155df","Type":"ContainerStarted","Data":"c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8"}
Oct 01 13:09:11 crc kubenswrapper[4749]: I1001 13:09:11.949168 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp57r" event={"ID":"40e79171-b4ab-495a-af2b-44ea7f6c91ef","Type":"ContainerStarted","Data":"111f0327dbab7bceb217b086703cc87be7d7c86c2a3d75f93f0f9947d373acfc"}
Oct 01 13:09:12 crc kubenswrapper[4749]: I1001 13:09:12.955599 4749 generic.go:334] "Generic (PLEG): container finished" podID="186102be-9b7e-4807-a141-6c85d69155df" containerID="c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8" exitCode=0
Oct 01 13:09:12 crc kubenswrapper[4749]: I1001 13:09:12.955870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkxgb" event={"ID":"186102be-9b7e-4807-a141-6c85d69155df","Type":"ContainerDied","Data":"c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8"}
Oct 01 13:09:12 crc kubenswrapper[4749]: I1001 13:09:12.959861 4749 generic.go:334] "Generic (PLEG): container finished" podID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerID="111f0327dbab7bceb217b086703cc87be7d7c86c2a3d75f93f0f9947d373acfc" exitCode=0
Oct 01 13:09:12 crc kubenswrapper[4749]: I1001 13:09:12.959905 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp57r" event={"ID":"40e79171-b4ab-495a-af2b-44ea7f6c91ef","Type":"ContainerDied","Data":"111f0327dbab7bceb217b086703cc87be7d7c86c2a3d75f93f0f9947d373acfc"}
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.235873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sbgjp"
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.235914 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sbgjp"
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.361905 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l7q4w"
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.361957 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l7q4w"
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.431207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l7q4w"
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.432524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sbgjp"
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.634782 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kw79f"
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.635422 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kw79f"
Oct 01 13:09:13 crc kubenswrapper[4749]: I1001 13:09:13.690863 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kw79f"
Oct 01 13:09:14 crc kubenswrapper[4749]: I1001 13:09:14.005743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l7q4w"
Oct 01 13:09:14 crc kubenswrapper[4749]: I1001 13:09:14.016214 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sbgjp"
Oct 01 13:09:15 crc kubenswrapper[4749]: I1001 13:09:15.160081 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4zj5j"]
Oct 01 13:09:15 crc kubenswrapper[4749]: I1001 13:09:15.312846 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzxsg"
Oct 01 13:09:15 crc kubenswrapper[4749]: I1001 13:09:15.312902 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzxsg"
Oct 01 13:09:15 crc kubenswrapper[4749]: I1001 13:09:15.353227 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gzxsg"
Oct 01 13:09:15 crc kubenswrapper[4749]: I1001 13:09:15.714704 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2vddf"
Oct 01 13:09:15 crc kubenswrapper[4749]: I1001 13:09:15.714743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2vddf"
Oct 01 13:09:15 crc kubenswrapper[4749]: I1001 13:09:15.807253 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2vddf"
Oct 01 13:09:16 crc kubenswrapper[4749]: I1001 13:09:16.014829 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzxsg"
Oct 01 13:09:16 crc kubenswrapper[4749]: I1001 13:09:16.045702 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2vddf"
Oct 01 13:09:16 crc kubenswrapper[4749]: I1001 13:09:16.334549 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2nbq9"
Oct 01 13:09:16 crc kubenswrapper[4749]: I1001 13:09:16.334862 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2nbq9"
Oct 01 13:09:16 crc kubenswrapper[4749]: I1001 13:09:16.388846 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2nbq9"
Oct 01 13:09:17 crc kubenswrapper[4749]: I1001 13:09:17.044640 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2nbq9"
Oct 01 13:09:17 crc kubenswrapper[4749]: I1001 13:09:17.987927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkxgb" event={"ID":"186102be-9b7e-4807-a141-6c85d69155df","Type":"ContainerStarted","Data":"c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a"}
Oct 01 13:09:19 crc kubenswrapper[4749]: I1001 13:09:19.272496 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dkxgb" podStartSLOduration=5.408870753 podStartE2EDuration="1m3.272458426s" podCreationTimestamp="2025-10-01 13:08:16 +0000 UTC" firstStartedPulling="2025-10-01 13:08:18.485598608 +0000 UTC m=+158.539583507" lastFinishedPulling="2025-10-01 13:09:16.349186281 +0000 UTC m=+216.403171180" observedRunningTime="2025-10-01 13:09:18.022181594 +0000 UTC m=+218.076166533" watchObservedRunningTime="2025-10-01 13:09:19.272458426 +0000 UTC m=+219.326443335"
Oct 01 13:09:19 crc kubenswrapper[4749]: I1001 13:09:19.273436 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vddf"]
Oct 01 13:09:19 crc kubenswrapper[4749]: I1001 13:09:19.273687 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2vddf" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerName="registry-server" containerID="cri-o://7aec6076ada0425224930e1d15754278ec95192f0b8075c650ac33baf47121c4" gracePeriod=2
Oct 01 13:09:21 crc kubenswrapper[4749]: I1001 13:09:21.014370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp57r" event={"ID":"40e79171-b4ab-495a-af2b-44ea7f6c91ef","Type":"ContainerStarted","Data":"16d29e8cf9197d3c440c0b38e0c080a6241330272b5d19710236c59c1801de62"}
Oct 01 13:09:21 crc kubenswrapper[4749]: I1001 13:09:21.016539 4749 generic.go:334] "Generic (PLEG): container finished" podID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerID="7aec6076ada0425224930e1d15754278ec95192f0b8075c650ac33baf47121c4" exitCode=0
Oct 01 13:09:21 crc kubenswrapper[4749]: I1001 13:09:21.016573 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vddf" event={"ID":"2016f7a2-9df5-474b-92e4-1bf3cd8175e3","Type":"ContainerDied","Data":"7aec6076ada0425224930e1d15754278ec95192f0b8075c650ac33baf47121c4"}
Oct 01 13:09:21 crc kubenswrapper[4749]: I1001 13:09:21.877893 4749 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.018508 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbm2g\" (UniqueName: \"kubernetes.io/projected/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-kube-api-access-dbm2g\") pod \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.018565 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-utilities\") pod \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.018706 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-catalog-content\") pod \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\" (UID: \"2016f7a2-9df5-474b-92e4-1bf3cd8175e3\") " Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.019475 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-utilities" (OuterVolumeSpecName: "utilities") pod "2016f7a2-9df5-474b-92e4-1bf3cd8175e3" (UID: "2016f7a2-9df5-474b-92e4-1bf3cd8175e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.023447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2vddf" event={"ID":"2016f7a2-9df5-474b-92e4-1bf3cd8175e3","Type":"ContainerDied","Data":"0d861d7347bf860392d94ceb9a403781af01d3862ee769e41592ed8f4b64f875"} Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.023478 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2vddf" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.023512 4749 scope.go:117] "RemoveContainer" containerID="7aec6076ada0425224930e1d15754278ec95192f0b8075c650ac33baf47121c4" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.023964 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-kube-api-access-dbm2g" (OuterVolumeSpecName: "kube-api-access-dbm2g") pod "2016f7a2-9df5-474b-92e4-1bf3cd8175e3" (UID: "2016f7a2-9df5-474b-92e4-1bf3cd8175e3"). InnerVolumeSpecName "kube-api-access-dbm2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.030413 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2016f7a2-9df5-474b-92e4-1bf3cd8175e3" (UID: "2016f7a2-9df5-474b-92e4-1bf3cd8175e3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.045724 4749 scope.go:117] "RemoveContainer" containerID="b510306b97c9724af6a85e93c5fd5e6842af07948d1bab663f8c8d5b4137b94b" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.052192 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xp57r" podStartSLOduration=4.490673691 podStartE2EDuration="1m9.052173044s" podCreationTimestamp="2025-10-01 13:08:13 +0000 UTC" firstStartedPulling="2025-10-01 13:08:15.378018906 +0000 UTC m=+155.432003805" lastFinishedPulling="2025-10-01 13:09:19.939518259 +0000 UTC m=+219.993503158" observedRunningTime="2025-10-01 13:09:22.048831847 +0000 UTC m=+222.102816756" watchObservedRunningTime="2025-10-01 13:09:22.052173044 +0000 UTC m=+222.106157943" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.061045 4749 scope.go:117] "RemoveContainer" containerID="a1f22af2c4c943a99ad3349ef4ae3aca19ebc0bd25b3de20eadced023cc4ef70" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.120463 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.120493 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbm2g\" (UniqueName: \"kubernetes.io/projected/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-kube-api-access-dbm2g\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.120503 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2016f7a2-9df5-474b-92e4-1bf3cd8175e3-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.347389 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-2vddf"] Oct 01 13:09:22 crc kubenswrapper[4749]: I1001 13:09:22.351109 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2vddf"] Oct 01 13:09:23 crc kubenswrapper[4749]: I1001 13:09:23.235310 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" path="/var/lib/kubelet/pods/2016f7a2-9df5-474b-92e4-1bf3cd8175e3/volumes" Oct 01 13:09:23 crc kubenswrapper[4749]: I1001 13:09:23.689889 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:09:23 crc kubenswrapper[4749]: I1001 13:09:23.811839 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:09:23 crc kubenswrapper[4749]: I1001 13:09:23.811903 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:09:23 crc kubenswrapper[4749]: I1001 13:09:23.867946 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:09:26 crc kubenswrapper[4749]: I1001 13:09:26.270326 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kw79f"] Oct 01 13:09:26 crc kubenswrapper[4749]: I1001 13:09:26.271013 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kw79f" podUID="1ad908df-119f-4496-a55d-64eb43918142" containerName="registry-server" containerID="cri-o://3778c16cbec149685eea1c4b6d799cd77cc505e5a020b4d9db3222445e9cecdc" gracePeriod=2 Oct 01 13:09:26 crc kubenswrapper[4749]: I1001 13:09:26.750145 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:09:26 crc 
kubenswrapper[4749]: I1001 13:09:26.750193 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:09:26 crc kubenswrapper[4749]: I1001 13:09:26.814938 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.051395 4749 generic.go:334] "Generic (PLEG): container finished" podID="1ad908df-119f-4496-a55d-64eb43918142" containerID="3778c16cbec149685eea1c4b6d799cd77cc505e5a020b4d9db3222445e9cecdc" exitCode=0 Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.051470 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw79f" event={"ID":"1ad908df-119f-4496-a55d-64eb43918142","Type":"ContainerDied","Data":"3778c16cbec149685eea1c4b6d799cd77cc505e5a020b4d9db3222445e9cecdc"} Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.100581 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.773464 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.797245 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-utilities\") pod \"1ad908df-119f-4496-a55d-64eb43918142\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.797336 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-catalog-content\") pod \"1ad908df-119f-4496-a55d-64eb43918142\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.797403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wndc9\" (UniqueName: \"kubernetes.io/projected/1ad908df-119f-4496-a55d-64eb43918142-kube-api-access-wndc9\") pod \"1ad908df-119f-4496-a55d-64eb43918142\" (UID: \"1ad908df-119f-4496-a55d-64eb43918142\") " Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.798176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-utilities" (OuterVolumeSpecName: "utilities") pod "1ad908df-119f-4496-a55d-64eb43918142" (UID: "1ad908df-119f-4496-a55d-64eb43918142"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.810441 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad908df-119f-4496-a55d-64eb43918142-kube-api-access-wndc9" (OuterVolumeSpecName: "kube-api-access-wndc9") pod "1ad908df-119f-4496-a55d-64eb43918142" (UID: "1ad908df-119f-4496-a55d-64eb43918142"). InnerVolumeSpecName "kube-api-access-wndc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.841983 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ad908df-119f-4496-a55d-64eb43918142" (UID: "1ad908df-119f-4496-a55d-64eb43918142"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.898312 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wndc9\" (UniqueName: \"kubernetes.io/projected/1ad908df-119f-4496-a55d-64eb43918142-kube-api-access-wndc9\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.898346 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:27 crc kubenswrapper[4749]: I1001 13:09:27.898357 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad908df-119f-4496-a55d-64eb43918142-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:28 crc kubenswrapper[4749]: I1001 13:09:28.058314 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kw79f" Oct 01 13:09:28 crc kubenswrapper[4749]: I1001 13:09:28.058377 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kw79f" event={"ID":"1ad908df-119f-4496-a55d-64eb43918142","Type":"ContainerDied","Data":"c88ee95e01794be93de1e9468a8bb0abef8dd9a780dc9694dcf84d5c62139bc9"} Oct 01 13:09:28 crc kubenswrapper[4749]: I1001 13:09:28.058452 4749 scope.go:117] "RemoveContainer" containerID="3778c16cbec149685eea1c4b6d799cd77cc505e5a020b4d9db3222445e9cecdc" Oct 01 13:09:28 crc kubenswrapper[4749]: I1001 13:09:28.080996 4749 scope.go:117] "RemoveContainer" containerID="6235f3ca2251572107f5ee52147f587d52c69d49db4ccf7a25921feefec078b5" Oct 01 13:09:28 crc kubenswrapper[4749]: I1001 13:09:28.083543 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kw79f"] Oct 01 13:09:28 crc kubenswrapper[4749]: I1001 13:09:28.089105 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kw79f"] Oct 01 13:09:28 crc kubenswrapper[4749]: I1001 13:09:28.112470 4749 scope.go:117] "RemoveContainer" containerID="f0b5ddaf52dfbe67fd329780e8c18e93a003e358af9c53462eb1352762f108e4" Oct 01 13:09:29 crc kubenswrapper[4749]: I1001 13:09:29.235359 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ad908df-119f-4496-a55d-64eb43918142" path="/var/lib/kubelet/pods/1ad908df-119f-4496-a55d-64eb43918142/volumes" Oct 01 13:09:30 crc kubenswrapper[4749]: I1001 13:09:30.675629 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dkxgb"] Oct 01 13:09:30 crc kubenswrapper[4749]: I1001 13:09:30.676151 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dkxgb" podUID="186102be-9b7e-4807-a141-6c85d69155df" containerName="registry-server" 
containerID="cri-o://c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a" gracePeriod=2 Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.047996 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.084407 4749 generic.go:334] "Generic (PLEG): container finished" podID="186102be-9b7e-4807-a141-6c85d69155df" containerID="c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a" exitCode=0 Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.084449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkxgb" event={"ID":"186102be-9b7e-4807-a141-6c85d69155df","Type":"ContainerDied","Data":"c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a"} Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.084477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkxgb" event={"ID":"186102be-9b7e-4807-a141-6c85d69155df","Type":"ContainerDied","Data":"38a649e037ef4cd1a2b760477e4a9397c769eabd63dfcf63aceea01490246820"} Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.084495 4749 scope.go:117] "RemoveContainer" containerID="c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.084516 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dkxgb" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.102736 4749 scope.go:117] "RemoveContainer" containerID="c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.115152 4749 scope.go:117] "RemoveContainer" containerID="5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.139361 4749 scope.go:117] "RemoveContainer" containerID="c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a" Oct 01 13:09:31 crc kubenswrapper[4749]: E1001 13:09:31.139771 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a\": container with ID starting with c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a not found: ID does not exist" containerID="c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.139857 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a"} err="failed to get container status \"c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a\": rpc error: code = NotFound desc = could not find container \"c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a\": container with ID starting with c155bac275e50f61ec2d82e30a83ed8a0e80cadda7b38cb5eec8eca41ec7aa8a not found: ID does not exist" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.139908 4749 scope.go:117] "RemoveContainer" containerID="c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8" Oct 01 13:09:31 crc kubenswrapper[4749]: E1001 13:09:31.140206 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8\": container with ID starting with c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8 not found: ID does not exist" containerID="c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.140258 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8"} err="failed to get container status \"c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8\": rpc error: code = NotFound desc = could not find container \"c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8\": container with ID starting with c310c334c42176b266a2e84a036fcd464c1637e718121f1be0c4fe4ae0850dc8 not found: ID does not exist" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.140281 4749 scope.go:117] "RemoveContainer" containerID="5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2" Oct 01 13:09:31 crc kubenswrapper[4749]: E1001 13:09:31.140557 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2\": container with ID starting with 5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2 not found: ID does not exist" containerID="5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.140582 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2"} err="failed to get container status \"5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2\": rpc error: code = NotFound desc = could not find container 
\"5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2\": container with ID starting with 5896b7db6cb0928c743ee418853e11eeb78c9b38c1767a4fa20d7d908fa8d4e2 not found: ID does not exist" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.248108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbl4r\" (UniqueName: \"kubernetes.io/projected/186102be-9b7e-4807-a141-6c85d69155df-kube-api-access-nbl4r\") pod \"186102be-9b7e-4807-a141-6c85d69155df\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.248178 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-utilities\") pod \"186102be-9b7e-4807-a141-6c85d69155df\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.248236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-catalog-content\") pod \"186102be-9b7e-4807-a141-6c85d69155df\" (UID: \"186102be-9b7e-4807-a141-6c85d69155df\") " Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.249580 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-utilities" (OuterVolumeSpecName: "utilities") pod "186102be-9b7e-4807-a141-6c85d69155df" (UID: "186102be-9b7e-4807-a141-6c85d69155df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.250929 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.254575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/186102be-9b7e-4807-a141-6c85d69155df-kube-api-access-nbl4r" (OuterVolumeSpecName: "kube-api-access-nbl4r") pod "186102be-9b7e-4807-a141-6c85d69155df" (UID: "186102be-9b7e-4807-a141-6c85d69155df"). InnerVolumeSpecName "kube-api-access-nbl4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.338388 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "186102be-9b7e-4807-a141-6c85d69155df" (UID: "186102be-9b7e-4807-a141-6c85d69155df"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.352734 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbl4r\" (UniqueName: \"kubernetes.io/projected/186102be-9b7e-4807-a141-6c85d69155df-kube-api-access-nbl4r\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.352767 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/186102be-9b7e-4807-a141-6c85d69155df-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.409392 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dkxgb"] Oct 01 13:09:31 crc kubenswrapper[4749]: I1001 13:09:31.415927 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dkxgb"] Oct 01 13:09:32 crc kubenswrapper[4749]: I1001 13:09:32.106716 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:09:32 crc kubenswrapper[4749]: I1001 13:09:32.106767 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:09:32 crc kubenswrapper[4749]: I1001 13:09:32.106812 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:09:32 crc kubenswrapper[4749]: I1001 13:09:32.107376 4749 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:09:32 crc kubenswrapper[4749]: I1001 13:09:32.107423 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5" gracePeriod=600 Oct 01 13:09:33 crc kubenswrapper[4749]: I1001 13:09:33.096572 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5" exitCode=0 Oct 01 13:09:33 crc kubenswrapper[4749]: I1001 13:09:33.096599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5"} Oct 01 13:09:33 crc kubenswrapper[4749]: I1001 13:09:33.096937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"6a452f1a75cf05bf9b08e26d6af338a726d9bc5128fc085f221e68420c78c125"} Oct 01 13:09:33 crc kubenswrapper[4749]: I1001 13:09:33.245314 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="186102be-9b7e-4807-a141-6c85d69155df" path="/var/lib/kubelet/pods/186102be-9b7e-4807-a141-6c85d69155df/volumes" Oct 01 13:09:33 crc kubenswrapper[4749]: I1001 13:09:33.848105 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:09:35 crc kubenswrapper[4749]: I1001 13:09:35.673832 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xp57r"] Oct 01 13:09:35 crc kubenswrapper[4749]: I1001 13:09:35.674282 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xp57r" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerName="registry-server" containerID="cri-o://16d29e8cf9197d3c440c0b38e0c080a6241330272b5d19710236c59c1801de62" gracePeriod=2 Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.113642 4749 generic.go:334] "Generic (PLEG): container finished" podID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerID="16d29e8cf9197d3c440c0b38e0c080a6241330272b5d19710236c59c1801de62" exitCode=0 Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.113690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp57r" event={"ID":"40e79171-b4ab-495a-af2b-44ea7f6c91ef","Type":"ContainerDied","Data":"16d29e8cf9197d3c440c0b38e0c080a6241330272b5d19710236c59c1801de62"} Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.113719 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xp57r" event={"ID":"40e79171-b4ab-495a-af2b-44ea7f6c91ef","Type":"ContainerDied","Data":"d2c6042186b9ce44eb34eda60432437ad00d9cce4c3bb3e97c67c25aeae49946"} Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.113731 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2c6042186b9ce44eb34eda60432437ad00d9cce4c3bb3e97c67c25aeae49946" Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.148482 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.315919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxl72\" (UniqueName: \"kubernetes.io/projected/40e79171-b4ab-495a-af2b-44ea7f6c91ef-kube-api-access-wxl72\") pod \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.315976 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-catalog-content\") pod \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.316031 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-utilities\") pod \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\" (UID: \"40e79171-b4ab-495a-af2b-44ea7f6c91ef\") " Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.316953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-utilities" (OuterVolumeSpecName: "utilities") pod "40e79171-b4ab-495a-af2b-44ea7f6c91ef" (UID: "40e79171-b4ab-495a-af2b-44ea7f6c91ef"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.323364 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e79171-b4ab-495a-af2b-44ea7f6c91ef-kube-api-access-wxl72" (OuterVolumeSpecName: "kube-api-access-wxl72") pod "40e79171-b4ab-495a-af2b-44ea7f6c91ef" (UID: "40e79171-b4ab-495a-af2b-44ea7f6c91ef"). InnerVolumeSpecName "kube-api-access-wxl72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.369869 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40e79171-b4ab-495a-af2b-44ea7f6c91ef" (UID: "40e79171-b4ab-495a-af2b-44ea7f6c91ef"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.417516 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxl72\" (UniqueName: \"kubernetes.io/projected/40e79171-b4ab-495a-af2b-44ea7f6c91ef-kube-api-access-wxl72\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.417551 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:36 crc kubenswrapper[4749]: I1001 13:09:36.417560 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40e79171-b4ab-495a-af2b-44ea7f6c91ef-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:37 crc kubenswrapper[4749]: I1001 13:09:37.119885 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xp57r" Oct 01 13:09:37 crc kubenswrapper[4749]: I1001 13:09:37.164827 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xp57r"] Oct 01 13:09:37 crc kubenswrapper[4749]: I1001 13:09:37.175742 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xp57r"] Oct 01 13:09:37 crc kubenswrapper[4749]: I1001 13:09:37.245391 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" path="/var/lib/kubelet/pods/40e79171-b4ab-495a-af2b-44ea7f6c91ef/volumes" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.194309 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" podUID="1e490c27-453c-4b8b-8a27-f446aee2178b" containerName="oauth-openshift" containerID="cri-o://6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b" gracePeriod=15 Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.681594 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.778907 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-router-certs\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779015 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-login\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779089 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-trusted-ca-bundle\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779150 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-error\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-provider-selection\") pod 
\"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779547 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-policies\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779708 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-cliconfig\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779806 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-ocp-branding-template\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779905 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-idp-0-file-data\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.779995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-dir\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc 
kubenswrapper[4749]: I1001 13:09:40.780038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-session\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.780134 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-service-ca\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.780309 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwjwk\" (UniqueName: \"kubernetes.io/projected/1e490c27-453c-4b8b-8a27-f446aee2178b-kube-api-access-xwjwk\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.780402 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.780506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-serving-cert\") pod \"1e490c27-453c-4b8b-8a27-f446aee2178b\" (UID: \"1e490c27-453c-4b8b-8a27-f446aee2178b\") " Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.781363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.781440 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.781946 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.781928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.782385 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.782422 4749 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.782446 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.782467 4749 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e490c27-453c-4b8b-8a27-f446aee2178b-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.782487 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.798632 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e490c27-453c-4b8b-8a27-f446aee2178b-kube-api-access-xwjwk" (OuterVolumeSpecName: "kube-api-access-xwjwk") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "kube-api-access-xwjwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.798782 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.799109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.803519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.804384 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.804971 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.805122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.806035 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.811586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1e490c27-453c-4b8b-8a27-f446aee2178b" (UID: "1e490c27-453c-4b8b-8a27-f446aee2178b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884129 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884182 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884197 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884207 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884249 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwjwk\" (UniqueName: 
\"kubernetes.io/projected/1e490c27-453c-4b8b-8a27-f446aee2178b-kube-api-access-xwjwk\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884259 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884268 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884279 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:40 crc kubenswrapper[4749]: I1001 13:09:40.884289 4749 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e490c27-453c-4b8b-8a27-f446aee2178b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.147930 4749 generic.go:334] "Generic (PLEG): container finished" podID="1e490c27-453c-4b8b-8a27-f446aee2178b" containerID="6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b" exitCode=0 Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.147987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" event={"ID":"1e490c27-453c-4b8b-8a27-f446aee2178b","Type":"ContainerDied","Data":"6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b"} Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.148012 4749 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.148045 4749 scope.go:117] "RemoveContainer" containerID="6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b" Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.148027 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4zj5j" event={"ID":"1e490c27-453c-4b8b-8a27-f446aee2178b","Type":"ContainerDied","Data":"04e397cc859ef10872adbb5e481ca422ae1865155a0f26e46e6989960bcc0951"} Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.190364 4749 scope.go:117] "RemoveContainer" containerID="6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b" Oct 01 13:09:41 crc kubenswrapper[4749]: E1001 13:09:41.190946 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b\": container with ID starting with 6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b not found: ID does not exist" containerID="6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b" Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.191000 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b"} err="failed to get container status \"6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b\": rpc error: code = NotFound desc = could not find container \"6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b\": container with ID starting with 6f856e8d6589f0cbc4547f642c4eb5d8217494d984947e4215d71b09a407868b not found: ID does not exist" Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.192312 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-4zj5j"] Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.198590 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4zj5j"] Oct 01 13:09:41 crc kubenswrapper[4749]: I1001 13:09:41.242802 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e490c27-453c-4b8b-8a27-f446aee2178b" path="/var/lib/kubelet/pods/1e490c27-453c-4b8b-8a27-f446aee2178b/volumes" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.462370 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57569d6b9d-xdshg"] Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463537 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad908df-119f-4496-a55d-64eb43918142" containerName="extract-utilities" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463565 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad908df-119f-4496-a55d-64eb43918142" containerName="extract-utilities" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463590 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463605 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463625 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e490c27-453c-4b8b-8a27-f446aee2178b" containerName="oauth-openshift" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463640 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e490c27-453c-4b8b-8a27-f446aee2178b" containerName="oauth-openshift" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463658 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463676 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463698 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerName="extract-content" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463712 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerName="extract-content" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463735 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f88ca20-b404-4d0d-8b3d-dd9171928e81" containerName="pruner" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463751 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f88ca20-b404-4d0d-8b3d-dd9171928e81" containerName="pruner" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463769 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5321a24-d271-46cb-9d0a-fde8089a6ddc" containerName="collect-profiles" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463784 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5321a24-d271-46cb-9d0a-fde8089a6ddc" containerName="collect-profiles" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463871 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2" containerName="pruner" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463889 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2" containerName="pruner" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463909 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" 
containerName="extract-utilities" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463924 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerName="extract-utilities" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463951 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186102be-9b7e-4807-a141-6c85d69155df" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.463965 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="186102be-9b7e-4807-a141-6c85d69155df" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.463987 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerName="extract-content" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464002 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerName="extract-content" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.464024 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad908df-119f-4496-a55d-64eb43918142" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464039 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad908df-119f-4496-a55d-64eb43918142" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.464064 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad908df-119f-4496-a55d-64eb43918142" containerName="extract-content" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464079 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad908df-119f-4496-a55d-64eb43918142" containerName="extract-content" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.464101 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" 
containerName="extract-utilities" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464116 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerName="extract-utilities" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.464138 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186102be-9b7e-4807-a141-6c85d69155df" containerName="extract-utilities" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464153 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="186102be-9b7e-4807-a141-6c85d69155df" containerName="extract-utilities" Oct 01 13:09:50 crc kubenswrapper[4749]: E1001 13:09:50.464173 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="186102be-9b7e-4807-a141-6c85d69155df" containerName="extract-content" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464189 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="186102be-9b7e-4807-a141-6c85d69155df" containerName="extract-content" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464453 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="186102be-9b7e-4807-a141-6c85d69155df" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464488 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f88ca20-b404-4d0d-8b3d-dd9171928e81" containerName="pruner" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464514 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad908df-119f-4496-a55d-64eb43918142" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464537 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5321a24-d271-46cb-9d0a-fde8089a6ddc" containerName="collect-profiles" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464559 4749 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3d4d92f5-8b12-49f4-a9ad-3190ad5a29d2" containerName="pruner" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464585 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e490c27-453c-4b8b-8a27-f446aee2178b" containerName="oauth-openshift" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464610 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e79171-b4ab-495a-af2b-44ea7f6c91ef" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.464632 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2016f7a2-9df5-474b-92e4-1bf3cd8175e3" containerName="registry-server" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.465393 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.470913 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.472808 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.473899 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.475763 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.476886 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.477082 4749 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.477333 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.477482 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.477588 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.479609 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.479870 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.480148 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.484341 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.491792 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57569d6b9d-xdshg"] Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.504675 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.534571 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 13:09:50 crc 
kubenswrapper[4749]: I1001 13:09:50.548610 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1480417-9dc3-4b37-8388-de59a2ca9539-audit-dir\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.548685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.548732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-session\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.548775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-service-ca\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.548843 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.548890 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.548957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-audit-policies\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.549028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.549074 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: 
\"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.549152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-error\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.549186 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk67g\" (UniqueName: \"kubernetes.io/projected/c1480417-9dc3-4b37-8388-de59a2ca9539-kube-api-access-xk67g\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.549255 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.549306 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-router-certs\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc 
kubenswrapper[4749]: I1001 13:09:50.549362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-login\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.649821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.649873 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.649905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-error\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.649922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk67g\" (UniqueName: 
\"kubernetes.io/projected/c1480417-9dc3-4b37-8388-de59a2ca9539-kube-api-access-xk67g\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.649940 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.649960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-router-certs\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.649979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-login\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.649997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1480417-9dc3-4b37-8388-de59a2ca9539-audit-dir\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" 
Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.650014 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.650031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-session\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.650049 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-service-ca\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.650073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.650089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.650115 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-audit-policies\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.650905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-audit-policies\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.650958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c1480417-9dc3-4b37-8388-de59a2ca9539-audit-dir\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.652917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.653982 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-service-ca\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.654510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.655688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.655834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-router-certs\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.655908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.656329 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-error\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.657391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.662714 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.662953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-user-template-login\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.666835 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c1480417-9dc3-4b37-8388-de59a2ca9539-v4-0-config-system-session\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.681965 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk67g\" (UniqueName: \"kubernetes.io/projected/c1480417-9dc3-4b37-8388-de59a2ca9539-kube-api-access-xk67g\") pod \"oauth-openshift-57569d6b9d-xdshg\" (UID: \"c1480417-9dc3-4b37-8388-de59a2ca9539\") " pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:50 crc kubenswrapper[4749]: I1001 13:09:50.820981 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:51 crc kubenswrapper[4749]: I1001 13:09:51.093648 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57569d6b9d-xdshg"] Oct 01 13:09:51 crc kubenswrapper[4749]: I1001 13:09:51.239880 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" event={"ID":"c1480417-9dc3-4b37-8388-de59a2ca9539","Type":"ContainerStarted","Data":"0991a259444c4df9d8c9bf699388219cd1a3469326d2c816b3b2b50e5508fc8c"} Oct 01 13:09:52 crc kubenswrapper[4749]: I1001 13:09:52.244185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" event={"ID":"c1480417-9dc3-4b37-8388-de59a2ca9539","Type":"ContainerStarted","Data":"bc6961a53db21db25c8d25e719156ccc945125744ec0119608aa3a54adf48781"} Oct 01 13:09:52 crc kubenswrapper[4749]: I1001 13:09:52.244791 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:52 crc kubenswrapper[4749]: I1001 13:09:52.253112 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" Oct 01 13:09:52 crc kubenswrapper[4749]: I1001 13:09:52.275431 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57569d6b9d-xdshg" podStartSLOduration=37.275403402 podStartE2EDuration="37.275403402s" podCreationTimestamp="2025-10-01 13:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:09:52.273911319 +0000 UTC m=+252.327896288" watchObservedRunningTime="2025-10-01 13:09:52.275403402 +0000 UTC m=+252.329388341" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.048898 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7q4w"] Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.050110 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l7q4w" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" containerName="registry-server" containerID="cri-o://b6bdf48c0652f72ae25621e8acdb358c370d10e24d94903c2562b7cfa7257a96" gracePeriod=30 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.053520 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sbgjp"] Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.054277 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sbgjp" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerName="registry-server" containerID="cri-o://e0acadafdd62c886e4f3bf16a58302fd4766c44d11d17cc8c9bf9463ac6d5be7" gracePeriod=30 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 
13:10:21.070386 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4kx2"] Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.070707 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" podUID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" containerName="marketplace-operator" containerID="cri-o://3bb94c31a6861b3dccacd1c1666dded2341d1921b17ee74d1c61c33914f608e6" gracePeriod=30 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.091200 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzxsg"] Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.091853 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzxsg" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerName="registry-server" containerID="cri-o://cab6c44f621abdc232ef646a269307808f105a3c435f1880c327e7cf84e4f04c" gracePeriod=30 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.100907 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rc7nj"] Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.101782 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.110212 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2nbq9"] Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.110493 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2nbq9" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerName="registry-server" containerID="cri-o://c645d5be49b52d44e6d98ec705ae7dd3961d0beb996925988db9b3fcc3d79160" gracePeriod=30 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.116559 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rc7nj"] Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.117419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87ce5873-e490-470b-8324-be053c551acb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.117472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z65f\" (UniqueName: \"kubernetes.io/projected/87ce5873-e490-470b-8324-be053c551acb-kube-api-access-5z65f\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.117504 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/87ce5873-e490-470b-8324-be053c551acb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.219411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87ce5873-e490-470b-8324-be053c551acb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.219492 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z65f\" (UniqueName: \"kubernetes.io/projected/87ce5873-e490-470b-8324-be053c551acb-kube-api-access-5z65f\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.219536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87ce5873-e490-470b-8324-be053c551acb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.221537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87ce5873-e490-470b-8324-be053c551acb-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc 
kubenswrapper[4749]: I1001 13:10:21.233276 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87ce5873-e490-470b-8324-be053c551acb-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.252413 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z65f\" (UniqueName: \"kubernetes.io/projected/87ce5873-e490-470b-8324-be053c551acb-kube-api-access-5z65f\") pod \"marketplace-operator-79b997595-rc7nj\" (UID: \"87ce5873-e490-470b-8324-be053c551acb\") " pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.436582 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e36eb4c-e668-403b-93d8-1940029337fc" containerID="b6bdf48c0652f72ae25621e8acdb358c370d10e24d94903c2562b7cfa7257a96" exitCode=0 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.437001 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7q4w" event={"ID":"4e36eb4c-e668-403b-93d8-1940029337fc","Type":"ContainerDied","Data":"b6bdf48c0652f72ae25621e8acdb358c370d10e24d94903c2562b7cfa7257a96"} Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.454583 4749 generic.go:334] "Generic (PLEG): container finished" podID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerID="e0acadafdd62c886e4f3bf16a58302fd4766c44d11d17cc8c9bf9463ac6d5be7" exitCode=0 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.454640 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgjp" event={"ID":"73c319a1-ef16-4c75-b18b-0fed4fba7fb7","Type":"ContainerDied","Data":"e0acadafdd62c886e4f3bf16a58302fd4766c44d11d17cc8c9bf9463ac6d5be7"} Oct 01 
13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.454665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sbgjp" event={"ID":"73c319a1-ef16-4c75-b18b-0fed4fba7fb7","Type":"ContainerDied","Data":"bd2ccbab8584bd1a7fc05d16bbcef4cbdef9d0852aa32850a6af5bb508a6924f"} Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.454677 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd2ccbab8584bd1a7fc05d16bbcef4cbdef9d0852aa32850a6af5bb508a6924f" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.466073 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerID="c645d5be49b52d44e6d98ec705ae7dd3961d0beb996925988db9b3fcc3d79160" exitCode=0 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.466156 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nbq9" event={"ID":"6f96d1ba-50b5-4f78-83c6-a97f3617ac75","Type":"ContainerDied","Data":"c645d5be49b52d44e6d98ec705ae7dd3961d0beb996925988db9b3fcc3d79160"} Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.525178 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.526456 4749 generic.go:334] "Generic (PLEG): container finished" podID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerID="cab6c44f621abdc232ef646a269307808f105a3c435f1880c327e7cf84e4f04c" exitCode=0 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.526519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzxsg" event={"ID":"6c9a345f-5d20-4673-a6bb-e70e387e5d8f","Type":"ContainerDied","Data":"cab6c44f621abdc232ef646a269307808f105a3c435f1880c327e7cf84e4f04c"} Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.527848 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.530683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" event={"ID":"85af42c2-35fc-4545-8fb2-22ab8beb3e22","Type":"ContainerDied","Data":"3bb94c31a6861b3dccacd1c1666dded2341d1921b17ee74d1c61c33914f608e6"} Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.531778 4749 generic.go:334] "Generic (PLEG): container finished" podID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" containerID="3bb94c31a6861b3dccacd1c1666dded2341d1921b17ee74d1c61c33914f608e6" exitCode=0 Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.555501 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.560100 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.579550 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.628463 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-catalog-content\") pod \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.628642 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-operator-metrics\") pod \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.628683 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv6kd\" (UniqueName: \"kubernetes.io/projected/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-kube-api-access-mv6kd\") pod \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.628709 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdnh7\" (UniqueName: \"kubernetes.io/projected/85af42c2-35fc-4545-8fb2-22ab8beb3e22-kube-api-access-wdnh7\") pod \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.628749 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-utilities\") pod \"4e36eb4c-e668-403b-93d8-1940029337fc\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.628785 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-utilities\") pod \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.629010 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-utilities\") pod \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.629032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-catalog-content\") pod \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\" (UID: \"6c9a345f-5d20-4673-a6bb-e70e387e5d8f\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.629093 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szhzb\" (UniqueName: \"kubernetes.io/projected/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-kube-api-access-szhzb\") pod \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\" (UID: \"73c319a1-ef16-4c75-b18b-0fed4fba7fb7\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.629120 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-trusted-ca\") pod \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\" (UID: \"85af42c2-35fc-4545-8fb2-22ab8beb3e22\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.629146 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/4e36eb4c-e668-403b-93d8-1940029337fc-kube-api-access-4qcsj\") pod 
\"4e36eb4c-e668-403b-93d8-1940029337fc\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.629179 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-catalog-content\") pod \"4e36eb4c-e668-403b-93d8-1940029337fc\" (UID: \"4e36eb4c-e668-403b-93d8-1940029337fc\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.634704 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-utilities" (OuterVolumeSpecName: "utilities") pod "6c9a345f-5d20-4673-a6bb-e70e387e5d8f" (UID: "6c9a345f-5d20-4673-a6bb-e70e387e5d8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.634795 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.635378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-utilities" (OuterVolumeSpecName: "utilities") pod "4e36eb4c-e668-403b-93d8-1940029337fc" (UID: "4e36eb4c-e668-403b-93d8-1940029337fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.639159 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-utilities" (OuterVolumeSpecName: "utilities") pod "73c319a1-ef16-4c75-b18b-0fed4fba7fb7" (UID: "73c319a1-ef16-4c75-b18b-0fed4fba7fb7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.645087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "85af42c2-35fc-4545-8fb2-22ab8beb3e22" (UID: "85af42c2-35fc-4545-8fb2-22ab8beb3e22"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.647587 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c9a345f-5d20-4673-a6bb-e70e387e5d8f" (UID: "6c9a345f-5d20-4673-a6bb-e70e387e5d8f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.659468 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-kube-api-access-mv6kd" (OuterVolumeSpecName: "kube-api-access-mv6kd") pod "6c9a345f-5d20-4673-a6bb-e70e387e5d8f" (UID: "6c9a345f-5d20-4673-a6bb-e70e387e5d8f"). InnerVolumeSpecName "kube-api-access-mv6kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.661399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-kube-api-access-szhzb" (OuterVolumeSpecName: "kube-api-access-szhzb") pod "73c319a1-ef16-4c75-b18b-0fed4fba7fb7" (UID: "73c319a1-ef16-4c75-b18b-0fed4fba7fb7"). InnerVolumeSpecName "kube-api-access-szhzb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.661478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e36eb4c-e668-403b-93d8-1940029337fc-kube-api-access-4qcsj" (OuterVolumeSpecName: "kube-api-access-4qcsj") pod "4e36eb4c-e668-403b-93d8-1940029337fc" (UID: "4e36eb4c-e668-403b-93d8-1940029337fc"). InnerVolumeSpecName "kube-api-access-4qcsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.661616 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "85af42c2-35fc-4545-8fb2-22ab8beb3e22" (UID: "85af42c2-35fc-4545-8fb2-22ab8beb3e22"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.661821 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85af42c2-35fc-4545-8fb2-22ab8beb3e22-kube-api-access-wdnh7" (OuterVolumeSpecName: "kube-api-access-wdnh7") pod "85af42c2-35fc-4545-8fb2-22ab8beb3e22" (UID: "85af42c2-35fc-4545-8fb2-22ab8beb3e22"). InnerVolumeSpecName "kube-api-access-wdnh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.691400 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73c319a1-ef16-4c75-b18b-0fed4fba7fb7" (UID: "73c319a1-ef16-4c75-b18b-0fed4fba7fb7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.693267 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e36eb4c-e668-403b-93d8-1940029337fc" (UID: "4e36eb4c-e668-403b-93d8-1940029337fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.729904 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.729942 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.729951 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.729964 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szhzb\" (UniqueName: \"kubernetes.io/projected/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-kube-api-access-szhzb\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.729974 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.729982 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-4qcsj\" (UniqueName: \"kubernetes.io/projected/4e36eb4c-e668-403b-93d8-1940029337fc-kube-api-access-4qcsj\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.729990 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.729998 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73c319a1-ef16-4c75-b18b-0fed4fba7fb7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.730005 4749 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/85af42c2-35fc-4545-8fb2-22ab8beb3e22-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.730014 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv6kd\" (UniqueName: \"kubernetes.io/projected/6c9a345f-5d20-4673-a6bb-e70e387e5d8f-kube-api-access-mv6kd\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.730022 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdnh7\" (UniqueName: \"kubernetes.io/projected/85af42c2-35fc-4545-8fb2-22ab8beb3e22-kube-api-access-wdnh7\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.730030 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e36eb4c-e668-403b-93d8-1940029337fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.830411 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-utilities\") pod \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.830536 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-catalog-content\") pod \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.830561 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h6jl\" (UniqueName: \"kubernetes.io/projected/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-kube-api-access-7h6jl\") pod \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\" (UID: \"6f96d1ba-50b5-4f78-83c6-a97f3617ac75\") " Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.831475 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-utilities" (OuterVolumeSpecName: "utilities") pod "6f96d1ba-50b5-4f78-83c6-a97f3617ac75" (UID: "6f96d1ba-50b5-4f78-83c6-a97f3617ac75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.835248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-kube-api-access-7h6jl" (OuterVolumeSpecName: "kube-api-access-7h6jl") pod "6f96d1ba-50b5-4f78-83c6-a97f3617ac75" (UID: "6f96d1ba-50b5-4f78-83c6-a97f3617ac75"). InnerVolumeSpecName "kube-api-access-7h6jl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.931904 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.931935 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h6jl\" (UniqueName: \"kubernetes.io/projected/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-kube-api-access-7h6jl\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.943798 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f96d1ba-50b5-4f78-83c6-a97f3617ac75" (UID: "6f96d1ba-50b5-4f78-83c6-a97f3617ac75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:10:21 crc kubenswrapper[4749]: I1001 13:10:21.956301 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rc7nj"] Oct 01 13:10:21 crc kubenswrapper[4749]: W1001 13:10:21.969566 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87ce5873_e490_470b_8324_be053c551acb.slice/crio-5f505abb5e1dd459420ede5eb18d891a4938405be6a8ff6d26fc5f4cf0bdf54e WatchSource:0}: Error finding container 5f505abb5e1dd459420ede5eb18d891a4938405be6a8ff6d26fc5f4cf0bdf54e: Status 404 returned error can't find the container with id 5f505abb5e1dd459420ede5eb18d891a4938405be6a8ff6d26fc5f4cf0bdf54e Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.033031 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6f96d1ba-50b5-4f78-83c6-a97f3617ac75-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.543475 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7q4w" event={"ID":"4e36eb4c-e668-403b-93d8-1940029337fc","Type":"ContainerDied","Data":"f498491b6c1aa9befad3987d37796aa07aafa378ff1de5dd6dabff8782bccab1"} Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.543732 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7q4w" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.543785 4749 scope.go:117] "RemoveContainer" containerID="b6bdf48c0652f72ae25621e8acdb358c370d10e24d94903c2562b7cfa7257a96" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.547480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2nbq9" event={"ID":"6f96d1ba-50b5-4f78-83c6-a97f3617ac75","Type":"ContainerDied","Data":"8372e4c25af46ace2499dfa54f632e5c4926e8da4b16ac697b5a101529cf5c11"} Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.547546 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2nbq9" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.551320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzxsg" event={"ID":"6c9a345f-5d20-4673-a6bb-e70e387e5d8f","Type":"ContainerDied","Data":"6785d623bb028c5b82925049dcacbe2be55ba131de482a8ab3dff238c1ccf5c2"} Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.551472 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzxsg" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.554014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" event={"ID":"85af42c2-35fc-4545-8fb2-22ab8beb3e22","Type":"ContainerDied","Data":"b3046022cd5326043e2ec1d27a0b7081429959bc1fd0aab4bd17b5bf4925ce57"} Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.554197 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4kx2" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.562055 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sbgjp" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.564947 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" event={"ID":"87ce5873-e490-470b-8324-be053c551acb","Type":"ContainerStarted","Data":"5899ea681f3aa2a4492937eecd9d9f46d3a336f5689ef6edfd1cc89eb6af9a53"} Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.565014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" event={"ID":"87ce5873-e490-470b-8324-be053c551acb","Type":"ContainerStarted","Data":"5f505abb5e1dd459420ede5eb18d891a4938405be6a8ff6d26fc5f4cf0bdf54e"} Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.566051 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.567567 4749 scope.go:117] "RemoveContainer" containerID="f6425c3375cc7d177b2d6900b8624543d5183252394810552037a1796db3bb4c" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.575802 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.609876 4749 scope.go:117] "RemoveContainer" containerID="7fa4e070b331a1c1f66e1919ccc5e061d638cc1f04e61aa88ffe9fdda3424da5" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.611137 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rc7nj" podStartSLOduration=1.611061809 podStartE2EDuration="1.611061809s" podCreationTimestamp="2025-10-01 13:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:10:22.595799596 +0000 UTC m=+282.649784535" watchObservedRunningTime="2025-10-01 13:10:22.611061809 +0000 UTC m=+282.665046718" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.640726 4749 scope.go:117] "RemoveContainer" containerID="c645d5be49b52d44e6d98ec705ae7dd3961d0beb996925988db9b3fcc3d79160" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.668109 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4kx2"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.670643 4749 scope.go:117] "RemoveContainer" containerID="96e7bae878b180e4fc1fa55288a793cc58b0fd5bea55a40a72c5fd6f1b4472c0" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.676625 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4kx2"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.688186 4749 scope.go:117] "RemoveContainer" containerID="79e2f2a7f272e1b309f4e200559a0173b715512f3d5a5eb9cee22c9861102741" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.688532 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7q4w"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.693095 4749 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l7q4w"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.696955 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2nbq9"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.699587 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2nbq9"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.702098 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzxsg"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.707364 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzxsg"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.717866 4749 scope.go:117] "RemoveContainer" containerID="cab6c44f621abdc232ef646a269307808f105a3c435f1880c327e7cf84e4f04c" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.718678 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sbgjp"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.723157 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sbgjp"] Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.742451 4749 scope.go:117] "RemoveContainer" containerID="a23b3f7f8ba0bd15d73f4dfc6dd13af0e81bf37dbb8309d1900967b6ac436d7b" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.761967 4749 scope.go:117] "RemoveContainer" containerID="8d7997bb1dbeec7de01b085a65e970053505dc49496a5b0dcd6cd15ee2540649" Oct 01 13:10:22 crc kubenswrapper[4749]: I1001 13:10:22.781202 4749 scope.go:117] "RemoveContainer" containerID="3bb94c31a6861b3dccacd1c1666dded2341d1921b17ee74d1c61c33914f608e6" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.240723 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4e36eb4c-e668-403b-93d8-1940029337fc" path="/var/lib/kubelet/pods/4e36eb4c-e668-403b-93d8-1940029337fc/volumes" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.241329 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" path="/var/lib/kubelet/pods/6c9a345f-5d20-4673-a6bb-e70e387e5d8f/volumes" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.241900 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" path="/var/lib/kubelet/pods/6f96d1ba-50b5-4f78-83c6-a97f3617ac75/volumes" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.242945 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" path="/var/lib/kubelet/pods/73c319a1-ef16-4c75-b18b-0fed4fba7fb7/volumes" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.243674 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" path="/var/lib/kubelet/pods/85af42c2-35fc-4545-8fb2-22ab8beb3e22/volumes" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.280676 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6l9vz"] Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281021 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerName="extract-utilities" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281045 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerName="extract-utilities" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281064 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" containerName="extract-utilities" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281077 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" containerName="extract-utilities" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281098 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerName="extract-content" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281114 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerName="extract-content" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281136 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" containerName="extract-content" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281149 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" containerName="extract-content" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281169 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281182 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281203 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerName="extract-utilities" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281365 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerName="extract-utilities" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281399 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerName="extract-content" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281413 4749 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerName="extract-content" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281431 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281444 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281465 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281479 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281494 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281507 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281527 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerName="extract-content" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281541 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerName="extract-content" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281568 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerName="extract-utilities" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281583 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerName="extract-utilities" Oct 01 13:10:23 crc kubenswrapper[4749]: E1001 13:10:23.281603 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" containerName="marketplace-operator" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281616 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" containerName="marketplace-operator" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281794 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f96d1ba-50b5-4f78-83c6-a97f3617ac75" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281823 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c9a345f-5d20-4673-a6bb-e70e387e5d8f" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281843 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c319a1-ef16-4c75-b18b-0fed4fba7fb7" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281864 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85af42c2-35fc-4545-8fb2-22ab8beb3e22" containerName="marketplace-operator" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.281890 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e36eb4c-e668-403b-93d8-1940029337fc" containerName="registry-server" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.283493 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.289001 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.292606 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6l9vz"] Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.356000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-catalog-content\") pod \"redhat-marketplace-6l9vz\" (UID: \"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.356114 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zxdc\" (UniqueName: \"kubernetes.io/projected/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-kube-api-access-4zxdc\") pod \"redhat-marketplace-6l9vz\" (UID: \"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.356161 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-utilities\") pod \"redhat-marketplace-6l9vz\" (UID: \"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.456981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zxdc\" (UniqueName: \"kubernetes.io/projected/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-kube-api-access-4zxdc\") pod \"redhat-marketplace-6l9vz\" (UID: 
\"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.457684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-utilities\") pod \"redhat-marketplace-6l9vz\" (UID: \"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.457812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-catalog-content\") pod \"redhat-marketplace-6l9vz\" (UID: \"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.458493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-utilities\") pod \"redhat-marketplace-6l9vz\" (UID: \"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.458685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-catalog-content\") pod \"redhat-marketplace-6l9vz\" (UID: \"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.492107 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bhtpj"] Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.493846 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.499001 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.500790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zxdc\" (UniqueName: \"kubernetes.io/projected/55f7c317-f4e5-4bf7-8245-e9f5a2291a52-kube-api-access-4zxdc\") pod \"redhat-marketplace-6l9vz\" (UID: \"55f7c317-f4e5-4bf7-8245-e9f5a2291a52\") " pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.514278 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhtpj"] Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.558977 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmgtt\" (UniqueName: \"kubernetes.io/projected/85551e82-da3b-4fc0-ad0b-39c8248062ed-kube-api-access-qmgtt\") pod \"redhat-operators-bhtpj\" (UID: \"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.560677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85551e82-da3b-4fc0-ad0b-39c8248062ed-utilities\") pod \"redhat-operators-bhtpj\" (UID: \"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.560825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85551e82-da3b-4fc0-ad0b-39c8248062ed-catalog-content\") pod \"redhat-operators-bhtpj\" (UID: 
\"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.598890 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.663171 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/85551e82-da3b-4fc0-ad0b-39c8248062ed-utilities\") pod \"redhat-operators-bhtpj\" (UID: \"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.663300 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85551e82-da3b-4fc0-ad0b-39c8248062ed-catalog-content\") pod \"redhat-operators-bhtpj\" (UID: \"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.663354 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmgtt\" (UniqueName: \"kubernetes.io/projected/85551e82-da3b-4fc0-ad0b-39c8248062ed-kube-api-access-qmgtt\") pod \"redhat-operators-bhtpj\" (UID: \"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.664196 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/85551e82-da3b-4fc0-ad0b-39c8248062ed-catalog-content\") pod \"redhat-operators-bhtpj\" (UID: \"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.664199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/85551e82-da3b-4fc0-ad0b-39c8248062ed-utilities\") pod \"redhat-operators-bhtpj\" (UID: \"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.682924 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmgtt\" (UniqueName: \"kubernetes.io/projected/85551e82-da3b-4fc0-ad0b-39c8248062ed-kube-api-access-qmgtt\") pod \"redhat-operators-bhtpj\" (UID: \"85551e82-da3b-4fc0-ad0b-39c8248062ed\") " pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.809574 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6l9vz"] Oct 01 13:10:23 crc kubenswrapper[4749]: I1001 13:10:23.831623 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:24 crc kubenswrapper[4749]: E1001 13:10:24.113001 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55f7c317_f4e5_4bf7_8245_e9f5a2291a52.slice/crio-5937a4498bde20452afc56f6cebfa69b93f5192c8c60343a10487ae122714abe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55f7c317_f4e5_4bf7_8245_e9f5a2291a52.slice/crio-conmon-5937a4498bde20452afc56f6cebfa69b93f5192c8c60343a10487ae122714abe.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:10:24 crc kubenswrapper[4749]: I1001 13:10:24.301674 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhtpj"] Oct 01 13:10:24 crc kubenswrapper[4749]: W1001 13:10:24.316198 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85551e82_da3b_4fc0_ad0b_39c8248062ed.slice/crio-c7701b12206b3185c0295c01f47d875f8c6e992a22baa016117b8dbeb4fd39de WatchSource:0}: Error finding container c7701b12206b3185c0295c01f47d875f8c6e992a22baa016117b8dbeb4fd39de: Status 404 returned error can't find the container with id c7701b12206b3185c0295c01f47d875f8c6e992a22baa016117b8dbeb4fd39de Oct 01 13:10:24 crc kubenswrapper[4749]: I1001 13:10:24.586654 4749 generic.go:334] "Generic (PLEG): container finished" podID="85551e82-da3b-4fc0-ad0b-39c8248062ed" containerID="a506166de9dedbb5553fe25d4034b640626128a5810b5027ea7d2c84f076bb31" exitCode=0 Oct 01 13:10:24 crc kubenswrapper[4749]: I1001 13:10:24.586780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhtpj" event={"ID":"85551e82-da3b-4fc0-ad0b-39c8248062ed","Type":"ContainerDied","Data":"a506166de9dedbb5553fe25d4034b640626128a5810b5027ea7d2c84f076bb31"} Oct 01 13:10:24 crc kubenswrapper[4749]: I1001 13:10:24.586830 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhtpj" event={"ID":"85551e82-da3b-4fc0-ad0b-39c8248062ed","Type":"ContainerStarted","Data":"c7701b12206b3185c0295c01f47d875f8c6e992a22baa016117b8dbeb4fd39de"} Oct 01 13:10:24 crc kubenswrapper[4749]: I1001 13:10:24.591084 4749 generic.go:334] "Generic (PLEG): container finished" podID="55f7c317-f4e5-4bf7-8245-e9f5a2291a52" containerID="5937a4498bde20452afc56f6cebfa69b93f5192c8c60343a10487ae122714abe" exitCode=0 Oct 01 13:10:24 crc kubenswrapper[4749]: I1001 13:10:24.592067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l9vz" event={"ID":"55f7c317-f4e5-4bf7-8245-e9f5a2291a52","Type":"ContainerDied","Data":"5937a4498bde20452afc56f6cebfa69b93f5192c8c60343a10487ae122714abe"} Oct 01 13:10:24 crc kubenswrapper[4749]: I1001 13:10:24.592201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6l9vz" event={"ID":"55f7c317-f4e5-4bf7-8245-e9f5a2291a52","Type":"ContainerStarted","Data":"fa8eb1c454fbb496095b55b5104edc852575054787ce84c71f1f28466d51552f"} Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.681582 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z2gzr"] Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.686668 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.688904 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.689901 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z2gzr"] Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.693711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6efccd9c-893a-4381-92aa-7e1e5053d7bd-catalog-content\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.693746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlltb\" (UniqueName: \"kubernetes.io/projected/6efccd9c-893a-4381-92aa-7e1e5053d7bd-kube-api-access-hlltb\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.693769 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6efccd9c-893a-4381-92aa-7e1e5053d7bd-utilities\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.795381 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6efccd9c-893a-4381-92aa-7e1e5053d7bd-catalog-content\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.795757 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlltb\" (UniqueName: \"kubernetes.io/projected/6efccd9c-893a-4381-92aa-7e1e5053d7bd-kube-api-access-hlltb\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.795787 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6efccd9c-893a-4381-92aa-7e1e5053d7bd-utilities\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.795958 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6efccd9c-893a-4381-92aa-7e1e5053d7bd-catalog-content\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.796281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6efccd9c-893a-4381-92aa-7e1e5053d7bd-utilities\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.822491 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlltb\" (UniqueName: \"kubernetes.io/projected/6efccd9c-893a-4381-92aa-7e1e5053d7bd-kube-api-access-hlltb\") pod \"certified-operators-z2gzr\" (UID: \"6efccd9c-893a-4381-92aa-7e1e5053d7bd\") " pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.879354 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n8fnx"] Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.883811 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n8fnx"] Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.883942 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.886364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.903441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqhrl\" (UniqueName: \"kubernetes.io/projected/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-kube-api-access-qqhrl\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.903514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-catalog-content\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:25 crc kubenswrapper[4749]: I1001 13:10:25.903541 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-utilities\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.004812 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqhrl\" (UniqueName: \"kubernetes.io/projected/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-kube-api-access-qqhrl\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.004885 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-catalog-content\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.004914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-utilities\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.005267 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-utilities\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.005412 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-catalog-content\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.014686 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.023073 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqhrl\" (UniqueName: \"kubernetes.io/projected/deb1f55e-fe85-4bc7-bf9a-b2272fcfb147-kube-api-access-qqhrl\") pod \"community-operators-n8fnx\" (UID: \"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147\") " pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.213764 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z2gzr"] Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.224573 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.607410 4749 generic.go:334] "Generic (PLEG): container finished" podID="55f7c317-f4e5-4bf7-8245-e9f5a2291a52" containerID="c8be49a67a0270b7bc6f2675a404363b1e5bf870e638ae72dc117ea4971cb18e" exitCode=0 Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.607760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l9vz" event={"ID":"55f7c317-f4e5-4bf7-8245-e9f5a2291a52","Type":"ContainerDied","Data":"c8be49a67a0270b7bc6f2675a404363b1e5bf870e638ae72dc117ea4971cb18e"} Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.612455 4749 generic.go:334] "Generic (PLEG): container finished" podID="85551e82-da3b-4fc0-ad0b-39c8248062ed" containerID="6aff549a313cb6fa73e80e93e30f0848a8b962a5e57cc5d8c9c75520dc064237" exitCode=0 Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.612552 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhtpj" event={"ID":"85551e82-da3b-4fc0-ad0b-39c8248062ed","Type":"ContainerDied","Data":"6aff549a313cb6fa73e80e93e30f0848a8b962a5e57cc5d8c9c75520dc064237"} Oct 
01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.615905 4749 generic.go:334] "Generic (PLEG): container finished" podID="6efccd9c-893a-4381-92aa-7e1e5053d7bd" containerID="4b0fccd3a4074d8bf952237dfa6643206c954d0001ff0558176c8c0557644d33" exitCode=0 Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.615930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z2gzr" event={"ID":"6efccd9c-893a-4381-92aa-7e1e5053d7bd","Type":"ContainerDied","Data":"4b0fccd3a4074d8bf952237dfa6643206c954d0001ff0558176c8c0557644d33"} Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.615945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z2gzr" event={"ID":"6efccd9c-893a-4381-92aa-7e1e5053d7bd","Type":"ContainerStarted","Data":"9188082932a1c9ba7fc03a808b1c742984512a9042e436655be44034753a0e9a"} Oct 01 13:10:26 crc kubenswrapper[4749]: I1001 13:10:26.663356 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n8fnx"] Oct 01 13:10:27 crc kubenswrapper[4749]: I1001 13:10:27.622710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6l9vz" event={"ID":"55f7c317-f4e5-4bf7-8245-e9f5a2291a52","Type":"ContainerStarted","Data":"7968172ad149aa4916ebba2acbb2e30c28a02762d383057f4991a299bc07f0d0"} Oct 01 13:10:27 crc kubenswrapper[4749]: I1001 13:10:27.623846 4749 generic.go:334] "Generic (PLEG): container finished" podID="deb1f55e-fe85-4bc7-bf9a-b2272fcfb147" containerID="137504ba599e249997a54720f75ca3fc7431107565e4acda1d1180b3876f427d" exitCode=0 Oct 01 13:10:27 crc kubenswrapper[4749]: I1001 13:10:27.623913 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8fnx" event={"ID":"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147","Type":"ContainerDied","Data":"137504ba599e249997a54720f75ca3fc7431107565e4acda1d1180b3876f427d"} Oct 01 13:10:27 crc 
kubenswrapper[4749]: I1001 13:10:27.623937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8fnx" event={"ID":"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147","Type":"ContainerStarted","Data":"69f9b79d1cf6dbd4fe53a0e191ccd24322c6880b827b64a296b0901e37fe6a21"} Oct 01 13:10:27 crc kubenswrapper[4749]: I1001 13:10:27.626641 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhtpj" event={"ID":"85551e82-da3b-4fc0-ad0b-39c8248062ed","Type":"ContainerStarted","Data":"ac14f3c9ec21e5b00e74ebe97e61ad5d64816dd0a01b987cb7d445dd5f2194b2"} Oct 01 13:10:27 crc kubenswrapper[4749]: I1001 13:10:27.628193 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z2gzr" event={"ID":"6efccd9c-893a-4381-92aa-7e1e5053d7bd","Type":"ContainerStarted","Data":"fd7ac0ba544f7887d6f1020f5271907e45c414f230262b5274de7c3657160fd8"} Oct 01 13:10:27 crc kubenswrapper[4749]: I1001 13:10:27.647917 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6l9vz" podStartSLOduration=2.154796437 podStartE2EDuration="4.647894095s" podCreationTimestamp="2025-10-01 13:10:23 +0000 UTC" firstStartedPulling="2025-10-01 13:10:24.593868245 +0000 UTC m=+284.647853164" lastFinishedPulling="2025-10-01 13:10:27.086965913 +0000 UTC m=+287.140950822" observedRunningTime="2025-10-01 13:10:27.647586096 +0000 UTC m=+287.701571005" watchObservedRunningTime="2025-10-01 13:10:27.647894095 +0000 UTC m=+287.701879004" Oct 01 13:10:27 crc kubenswrapper[4749]: I1001 13:10:27.687711 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bhtpj" podStartSLOduration=2.201993297 podStartE2EDuration="4.68769187s" podCreationTimestamp="2025-10-01 13:10:23 +0000 UTC" firstStartedPulling="2025-10-01 13:10:24.588985943 +0000 UTC m=+284.642970892" lastFinishedPulling="2025-10-01 
13:10:27.074684546 +0000 UTC m=+287.128669465" observedRunningTime="2025-10-01 13:10:27.687158865 +0000 UTC m=+287.741143774" watchObservedRunningTime="2025-10-01 13:10:27.68769187 +0000 UTC m=+287.741676769" Oct 01 13:10:28 crc kubenswrapper[4749]: I1001 13:10:28.637726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8fnx" event={"ID":"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147","Type":"ContainerStarted","Data":"560c19327e2abfc609a14cf13cb1d1b6816943da58e16ac543c245073b2df662"} Oct 01 13:10:28 crc kubenswrapper[4749]: I1001 13:10:28.642826 4749 generic.go:334] "Generic (PLEG): container finished" podID="6efccd9c-893a-4381-92aa-7e1e5053d7bd" containerID="fd7ac0ba544f7887d6f1020f5271907e45c414f230262b5274de7c3657160fd8" exitCode=0 Oct 01 13:10:28 crc kubenswrapper[4749]: I1001 13:10:28.643409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z2gzr" event={"ID":"6efccd9c-893a-4381-92aa-7e1e5053d7bd","Type":"ContainerDied","Data":"fd7ac0ba544f7887d6f1020f5271907e45c414f230262b5274de7c3657160fd8"} Oct 01 13:10:29 crc kubenswrapper[4749]: I1001 13:10:29.653185 4749 generic.go:334] "Generic (PLEG): container finished" podID="deb1f55e-fe85-4bc7-bf9a-b2272fcfb147" containerID="560c19327e2abfc609a14cf13cb1d1b6816943da58e16ac543c245073b2df662" exitCode=0 Oct 01 13:10:29 crc kubenswrapper[4749]: I1001 13:10:29.653580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8fnx" event={"ID":"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147","Type":"ContainerDied","Data":"560c19327e2abfc609a14cf13cb1d1b6816943da58e16ac543c245073b2df662"} Oct 01 13:10:30 crc kubenswrapper[4749]: I1001 13:10:30.660756 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n8fnx" 
event={"ID":"deb1f55e-fe85-4bc7-bf9a-b2272fcfb147","Type":"ContainerStarted","Data":"e981e6a0af7709c73ab01a12747aadfc599973da4375e5fd95993e013b2d7970"} Oct 01 13:10:30 crc kubenswrapper[4749]: I1001 13:10:30.663013 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z2gzr" event={"ID":"6efccd9c-893a-4381-92aa-7e1e5053d7bd","Type":"ContainerStarted","Data":"14e001847b1fbdc5f241d4acc7c7a043032d203f90578c53db4a82de9f4aaca2"} Oct 01 13:10:30 crc kubenswrapper[4749]: I1001 13:10:30.680236 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n8fnx" podStartSLOduration=3.662888257 podStartE2EDuration="5.680202375s" podCreationTimestamp="2025-10-01 13:10:25 +0000 UTC" firstStartedPulling="2025-10-01 13:10:27.625093493 +0000 UTC m=+287.679078392" lastFinishedPulling="2025-10-01 13:10:29.642407611 +0000 UTC m=+289.696392510" observedRunningTime="2025-10-01 13:10:30.677907279 +0000 UTC m=+290.731892178" watchObservedRunningTime="2025-10-01 13:10:30.680202375 +0000 UTC m=+290.734187294" Oct 01 13:10:30 crc kubenswrapper[4749]: I1001 13:10:30.698345 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z2gzr" podStartSLOduration=2.674563358 podStartE2EDuration="5.698322641s" podCreationTimestamp="2025-10-01 13:10:25 +0000 UTC" firstStartedPulling="2025-10-01 13:10:26.618852664 +0000 UTC m=+286.672837603" lastFinishedPulling="2025-10-01 13:10:29.642611987 +0000 UTC m=+289.696596886" observedRunningTime="2025-10-01 13:10:30.694944263 +0000 UTC m=+290.748929172" watchObservedRunningTime="2025-10-01 13:10:30.698322641 +0000 UTC m=+290.752307540" Oct 01 13:10:33 crc kubenswrapper[4749]: I1001 13:10:33.600474 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:33 crc kubenswrapper[4749]: I1001 13:10:33.600872 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:33 crc kubenswrapper[4749]: I1001 13:10:33.649765 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:33 crc kubenswrapper[4749]: I1001 13:10:33.723043 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6l9vz" Oct 01 13:10:33 crc kubenswrapper[4749]: I1001 13:10:33.834046 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:33 crc kubenswrapper[4749]: I1001 13:10:33.834708 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:33 crc kubenswrapper[4749]: I1001 13:10:33.873178 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:34 crc kubenswrapper[4749]: I1001 13:10:34.735879 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bhtpj" Oct 01 13:10:36 crc kubenswrapper[4749]: I1001 13:10:36.015817 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:36 crc kubenswrapper[4749]: I1001 13:10:36.015884 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:36 crc kubenswrapper[4749]: I1001 13:10:36.071181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:10:36 crc kubenswrapper[4749]: I1001 13:10:36.225396 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:36 crc kubenswrapper[4749]: I1001 13:10:36.225466 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:36 crc kubenswrapper[4749]: I1001 13:10:36.290384 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:36 crc kubenswrapper[4749]: I1001 13:10:36.748512 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n8fnx" Oct 01 13:10:36 crc kubenswrapper[4749]: I1001 13:10:36.764828 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z2gzr" Oct 01 13:11:32 crc kubenswrapper[4749]: I1001 13:11:32.107552 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:11:32 crc kubenswrapper[4749]: I1001 13:11:32.108294 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:12:02 crc kubenswrapper[4749]: I1001 13:12:02.106554 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:12:02 crc kubenswrapper[4749]: I1001 13:12:02.107631 4749 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.790043 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nb9kf"] Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.791727 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.846947 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-bound-sa-token\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.847004 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.847024 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-registry-certificates\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.847051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-registry-tls\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.847068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-trusted-ca\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.847090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll6j7\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-kube-api-access-ll6j7\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.847107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.847124 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.855085 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nb9kf"] Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.876709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.947863 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-registry-certificates\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.947927 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-registry-tls\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.947947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-trusted-ca\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: 
\"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.947967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6j7\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-kube-api-access-ll6j7\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.947984 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.948002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.948036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-bound-sa-token\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.949092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.949765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-registry-certificates\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.950735 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-trusted-ca\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.955126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.963792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-bound-sa-token\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.968637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ll6j7\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-kube-api-access-ll6j7\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:23 crc kubenswrapper[4749]: I1001 13:12:23.976642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/318c7cdd-e7da-4532-8087-bfa53bc3d3f7-registry-tls\") pod \"image-registry-66df7c8f76-nb9kf\" (UID: \"318c7cdd-e7da-4532-8087-bfa53bc3d3f7\") " pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:24 crc kubenswrapper[4749]: I1001 13:12:24.111957 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:24 crc kubenswrapper[4749]: I1001 13:12:24.347803 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nb9kf"] Oct 01 13:12:24 crc kubenswrapper[4749]: W1001 13:12:24.359306 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod318c7cdd_e7da_4532_8087_bfa53bc3d3f7.slice/crio-8c0d8609e2e3113360b20c49badb62bc28c94054d7d5b54b3e11ba468b4157bf WatchSource:0}: Error finding container 8c0d8609e2e3113360b20c49badb62bc28c94054d7d5b54b3e11ba468b4157bf: Status 404 returned error can't find the container with id 8c0d8609e2e3113360b20c49badb62bc28c94054d7d5b54b3e11ba468b4157bf Oct 01 13:12:24 crc kubenswrapper[4749]: I1001 13:12:24.395155 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" event={"ID":"318c7cdd-e7da-4532-8087-bfa53bc3d3f7","Type":"ContainerStarted","Data":"8c0d8609e2e3113360b20c49badb62bc28c94054d7d5b54b3e11ba468b4157bf"} Oct 01 13:12:25 crc kubenswrapper[4749]: I1001 
13:12:25.402692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" event={"ID":"318c7cdd-e7da-4532-8087-bfa53bc3d3f7","Type":"ContainerStarted","Data":"a5ffd8ce8df44828f266286c2229864e63971f8445ad2bd5f39e84c3cd9ed8b6"} Oct 01 13:12:25 crc kubenswrapper[4749]: I1001 13:12:25.403089 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:25 crc kubenswrapper[4749]: I1001 13:12:25.444161 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" podStartSLOduration=2.444133706 podStartE2EDuration="2.444133706s" podCreationTimestamp="2025-10-01 13:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:12:25.435566715 +0000 UTC m=+405.489551684" watchObservedRunningTime="2025-10-01 13:12:25.444133706 +0000 UTC m=+405.498118645" Oct 01 13:12:32 crc kubenswrapper[4749]: I1001 13:12:32.106639 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:12:32 crc kubenswrapper[4749]: I1001 13:12:32.107418 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:12:32 crc kubenswrapper[4749]: I1001 13:12:32.107472 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:12:32 crc kubenswrapper[4749]: I1001 13:12:32.108118 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a452f1a75cf05bf9b08e26d6af338a726d9bc5128fc085f221e68420c78c125"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:12:32 crc kubenswrapper[4749]: I1001 13:12:32.108200 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://6a452f1a75cf05bf9b08e26d6af338a726d9bc5128fc085f221e68420c78c125" gracePeriod=600 Oct 01 13:12:32 crc kubenswrapper[4749]: I1001 13:12:32.456436 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="6a452f1a75cf05bf9b08e26d6af338a726d9bc5128fc085f221e68420c78c125" exitCode=0 Oct 01 13:12:32 crc kubenswrapper[4749]: I1001 13:12:32.456502 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"6a452f1a75cf05bf9b08e26d6af338a726d9bc5128fc085f221e68420c78c125"} Oct 01 13:12:32 crc kubenswrapper[4749]: I1001 13:12:32.456827 4749 scope.go:117] "RemoveContainer" containerID="bcaec2673385eeadafa81bec95812e145369e61564ac2cb1a8884400af2ac1c5" Oct 01 13:12:33 crc kubenswrapper[4749]: I1001 13:12:33.465807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" 
event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"d37b6e61f0b3af0ecef6f3f6b8e1af5456836ce443313f97dfd8a281c3e0b927"} Oct 01 13:12:44 crc kubenswrapper[4749]: I1001 13:12:44.121331 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-nb9kf" Oct 01 13:12:44 crc kubenswrapper[4749]: I1001 13:12:44.201085 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqndp"] Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.254185 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" podUID="8ba15709-180a-4045-9d19-df6de2d8cf6e" containerName="registry" containerID="cri-o://11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2" gracePeriod=30 Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.643492 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.750480 4749 generic.go:334] "Generic (PLEG): container finished" podID="8ba15709-180a-4045-9d19-df6de2d8cf6e" containerID="11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2" exitCode=0 Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.750541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" event={"ID":"8ba15709-180a-4045-9d19-df6de2d8cf6e","Type":"ContainerDied","Data":"11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2"} Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.750586 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.750612 4749 scope.go:117] "RemoveContainer" containerID="11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.750591 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqndp" event={"ID":"8ba15709-180a-4045-9d19-df6de2d8cf6e","Type":"ContainerDied","Data":"fc4bcd2ed4a514970d710f3e4a639cdab2a9bf06b7bda3e9ad2adbcf5589e5b4"} Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.777850 4749 scope.go:117] "RemoveContainer" containerID="11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2" Oct 01 13:13:09 crc kubenswrapper[4749]: E1001 13:13:09.778468 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2\": container with ID starting with 11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2 not found: ID does not exist" containerID="11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.778534 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2"} err="failed to get container status \"11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2\": rpc error: code = NotFound desc = could not find container \"11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2\": container with ID starting with 11a9ffc143b6ebaf6cd0341f03a6efabcf86ee15a9973138a6a9f636bc7402c2 not found: ID does not exist" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.803964 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-bound-sa-token\") pod \"8ba15709-180a-4045-9d19-df6de2d8cf6e\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.804047 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-certificates\") pod \"8ba15709-180a-4045-9d19-df6de2d8cf6e\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.804191 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8ba15709-180a-4045-9d19-df6de2d8cf6e\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.804270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ba15709-180a-4045-9d19-df6de2d8cf6e-installation-pull-secrets\") pod \"8ba15709-180a-4045-9d19-df6de2d8cf6e\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.804332 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4cw7\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-kube-api-access-w4cw7\") pod \"8ba15709-180a-4045-9d19-df6de2d8cf6e\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.804616 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ba15709-180a-4045-9d19-df6de2d8cf6e-ca-trust-extracted\") pod \"8ba15709-180a-4045-9d19-df6de2d8cf6e\" (UID: 
\"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.804656 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-tls\") pod \"8ba15709-180a-4045-9d19-df6de2d8cf6e\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.804692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-trusted-ca\") pod \"8ba15709-180a-4045-9d19-df6de2d8cf6e\" (UID: \"8ba15709-180a-4045-9d19-df6de2d8cf6e\") " Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.806482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8ba15709-180a-4045-9d19-df6de2d8cf6e" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.806749 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8ba15709-180a-4045-9d19-df6de2d8cf6e" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.815586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-kube-api-access-w4cw7" (OuterVolumeSpecName: "kube-api-access-w4cw7") pod "8ba15709-180a-4045-9d19-df6de2d8cf6e" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e"). 
InnerVolumeSpecName "kube-api-access-w4cw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.817177 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ba15709-180a-4045-9d19-df6de2d8cf6e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8ba15709-180a-4045-9d19-df6de2d8cf6e" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.817382 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8ba15709-180a-4045-9d19-df6de2d8cf6e" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.817460 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8ba15709-180a-4045-9d19-df6de2d8cf6e" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.826786 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8ba15709-180a-4045-9d19-df6de2d8cf6e" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.833201 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba15709-180a-4045-9d19-df6de2d8cf6e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8ba15709-180a-4045-9d19-df6de2d8cf6e" (UID: "8ba15709-180a-4045-9d19-df6de2d8cf6e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.906041 4749 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8ba15709-180a-4045-9d19-df6de2d8cf6e-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.906072 4749 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.906080 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.906088 4749 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.906096 4749 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8ba15709-180a-4045-9d19-df6de2d8cf6e-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.906108 4749 reconciler_common.go:293] "Volume detached for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8ba15709-180a-4045-9d19-df6de2d8cf6e-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:09 crc kubenswrapper[4749]: I1001 13:13:09.906118 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4cw7\" (UniqueName: \"kubernetes.io/projected/8ba15709-180a-4045-9d19-df6de2d8cf6e-kube-api-access-w4cw7\") on node \"crc\" DevicePath \"\"" Oct 01 13:13:10 crc kubenswrapper[4749]: I1001 13:13:10.112823 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqndp"] Oct 01 13:13:10 crc kubenswrapper[4749]: I1001 13:13:10.121120 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqndp"] Oct 01 13:13:11 crc kubenswrapper[4749]: I1001 13:13:11.243279 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba15709-180a-4045-9d19-df6de2d8cf6e" path="/var/lib/kubelet/pods/8ba15709-180a-4045-9d19-df6de2d8cf6e/volumes" Oct 01 13:14:32 crc kubenswrapper[4749]: I1001 13:14:32.107207 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:14:32 crc kubenswrapper[4749]: I1001 13:14:32.107885 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:14:41 crc kubenswrapper[4749]: I1001 13:14:41.362348 4749 scope.go:117] "RemoveContainer" containerID="2088cbd4f3e5eb5bbe03bfbcd887a867a7616a94c41f160ce7e9ee422ed52ceb" Oct 01 
13:14:41 crc kubenswrapper[4749]: I1001 13:14:41.399823 4749 scope.go:117] "RemoveContainer" containerID="c406e2b954937a18038f8c54a4fa5b8295df59cb695bb529b5dfdcdc21f24939" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.143680 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8"] Oct 01 13:15:00 crc kubenswrapper[4749]: E1001 13:15:00.144705 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba15709-180a-4045-9d19-df6de2d8cf6e" containerName="registry" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.144729 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba15709-180a-4045-9d19-df6de2d8cf6e" containerName="registry" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.144927 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba15709-180a-4045-9d19-df6de2d8cf6e" containerName="registry" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.145680 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.148875 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.152303 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8"] Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.189590 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.325994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shvwg\" (UniqueName: \"kubernetes.io/projected/3f3049be-5196-40de-8ff1-1895937e9510-kube-api-access-shvwg\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.326099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f3049be-5196-40de-8ff1-1895937e9510-secret-volume\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.326147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f3049be-5196-40de-8ff1-1895937e9510-config-volume\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.427095 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shvwg\" (UniqueName: \"kubernetes.io/projected/3f3049be-5196-40de-8ff1-1895937e9510-kube-api-access-shvwg\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.427188 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f3049be-5196-40de-8ff1-1895937e9510-secret-volume\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.427263 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f3049be-5196-40de-8ff1-1895937e9510-config-volume\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.428707 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f3049be-5196-40de-8ff1-1895937e9510-config-volume\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.437513 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3f3049be-5196-40de-8ff1-1895937e9510-secret-volume\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.458316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shvwg\" (UniqueName: \"kubernetes.io/projected/3f3049be-5196-40de-8ff1-1895937e9510-kube-api-access-shvwg\") pod \"collect-profiles-29322075-nrdq8\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.506870 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:00 crc kubenswrapper[4749]: I1001 13:15:00.739626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8"] Oct 01 13:15:01 crc kubenswrapper[4749]: I1001 13:15:01.545209 4749 generic.go:334] "Generic (PLEG): container finished" podID="3f3049be-5196-40de-8ff1-1895937e9510" containerID="7bd4bbb0493c1232b12bf40d1307beb514880c0a626d7b82b2ce817ad73344e6" exitCode=0 Oct 01 13:15:01 crc kubenswrapper[4749]: I1001 13:15:01.545380 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" event={"ID":"3f3049be-5196-40de-8ff1-1895937e9510","Type":"ContainerDied","Data":"7bd4bbb0493c1232b12bf40d1307beb514880c0a626d7b82b2ce817ad73344e6"} Oct 01 13:15:01 crc kubenswrapper[4749]: I1001 13:15:01.545600 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" 
event={"ID":"3f3049be-5196-40de-8ff1-1895937e9510","Type":"ContainerStarted","Data":"b8063bf6997bd5e3867f3322345b01cb866acf752d7baaa2f3058e7a47d5fe0f"} Oct 01 13:15:02 crc kubenswrapper[4749]: I1001 13:15:02.106564 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:15:02 crc kubenswrapper[4749]: I1001 13:15:02.106710 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:15:02 crc kubenswrapper[4749]: I1001 13:15:02.885442 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.063355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f3049be-5196-40de-8ff1-1895937e9510-config-volume\") pod \"3f3049be-5196-40de-8ff1-1895937e9510\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.063445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shvwg\" (UniqueName: \"kubernetes.io/projected/3f3049be-5196-40de-8ff1-1895937e9510-kube-api-access-shvwg\") pod \"3f3049be-5196-40de-8ff1-1895937e9510\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.063525 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f3049be-5196-40de-8ff1-1895937e9510-secret-volume\") pod \"3f3049be-5196-40de-8ff1-1895937e9510\" (UID: \"3f3049be-5196-40de-8ff1-1895937e9510\") " Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.064349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3049be-5196-40de-8ff1-1895937e9510-config-volume" (OuterVolumeSpecName: "config-volume") pod "3f3049be-5196-40de-8ff1-1895937e9510" (UID: "3f3049be-5196-40de-8ff1-1895937e9510"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.069745 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3049be-5196-40de-8ff1-1895937e9510-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3f3049be-5196-40de-8ff1-1895937e9510" (UID: "3f3049be-5196-40de-8ff1-1895937e9510"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.071383 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3049be-5196-40de-8ff1-1895937e9510-kube-api-access-shvwg" (OuterVolumeSpecName: "kube-api-access-shvwg") pod "3f3049be-5196-40de-8ff1-1895937e9510" (UID: "3f3049be-5196-40de-8ff1-1895937e9510"). InnerVolumeSpecName "kube-api-access-shvwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.165108 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f3049be-5196-40de-8ff1-1895937e9510-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.165156 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shvwg\" (UniqueName: \"kubernetes.io/projected/3f3049be-5196-40de-8ff1-1895937e9510-kube-api-access-shvwg\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.165178 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3f3049be-5196-40de-8ff1-1895937e9510-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.560795 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" event={"ID":"3f3049be-5196-40de-8ff1-1895937e9510","Type":"ContainerDied","Data":"b8063bf6997bd5e3867f3322345b01cb866acf752d7baaa2f3058e7a47d5fe0f"} Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.560840 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8063bf6997bd5e3867f3322345b01cb866acf752d7baaa2f3058e7a47d5fe0f" Oct 01 13:15:03 crc kubenswrapper[4749]: I1001 13:15:03.561284 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8" Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.106324 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.107054 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.107113 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.107813 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d37b6e61f0b3af0ecef6f3f6b8e1af5456836ce443313f97dfd8a281c3e0b927"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.107882 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://d37b6e61f0b3af0ecef6f3f6b8e1af5456836ce443313f97dfd8a281c3e0b927" gracePeriod=600 Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.756861 4749 generic.go:334] "Generic (PLEG): container 
finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="d37b6e61f0b3af0ecef6f3f6b8e1af5456836ce443313f97dfd8a281c3e0b927" exitCode=0 Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.756997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"d37b6e61f0b3af0ecef6f3f6b8e1af5456836ce443313f97dfd8a281c3e0b927"} Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.757381 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"bf9f42cda1c41e2e8bbd8a36b3bc094b1e73bf93791da0833fab72799f05a62e"} Oct 01 13:15:32 crc kubenswrapper[4749]: I1001 13:15:32.757405 4749 scope.go:117] "RemoveContainer" containerID="6a452f1a75cf05bf9b08e26d6af338a726d9bc5128fc085f221e68420c78c125" Oct 01 13:15:41 crc kubenswrapper[4749]: I1001 13:15:41.438325 4749 scope.go:117] "RemoveContainer" containerID="111f0327dbab7bceb217b086703cc87be7d7c86c2a3d75f93f0f9947d373acfc" Oct 01 13:15:41 crc kubenswrapper[4749]: I1001 13:15:41.474733 4749 scope.go:117] "RemoveContainer" containerID="e0acadafdd62c886e4f3bf16a58302fd4766c44d11d17cc8c9bf9463ac6d5be7" Oct 01 13:15:41 crc kubenswrapper[4749]: I1001 13:15:41.493338 4749 scope.go:117] "RemoveContainer" containerID="16d29e8cf9197d3c440c0b38e0c080a6241330272b5d19710236c59c1801de62" Oct 01 13:15:41 crc kubenswrapper[4749]: I1001 13:15:41.509602 4749 scope.go:117] "RemoveContainer" containerID="d62af885e013b5bd5478dbda934e83a694e6db14fc0a4a9817e22276445b7cf6" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.481672 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-8k42j"] Oct 01 13:16:35 crc kubenswrapper[4749]: E1001 13:16:35.482412 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3f3049be-5196-40de-8ff1-1895937e9510" containerName="collect-profiles" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.482429 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3049be-5196-40de-8ff1-1895937e9510" containerName="collect-profiles" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.482568 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f3049be-5196-40de-8ff1-1895937e9510" containerName="collect-profiles" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.482975 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-8k42j" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.484782 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-47dgh" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.485361 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.485581 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.494059 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4mvl7"] Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.494797 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4mvl7" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.500315 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-42l8t" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.506079 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-8k42j"] Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.526335 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dbrwz"] Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.527063 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.533480 4749 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-9tz4s" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.548259 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dbrwz"] Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.569500 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4mvl7"] Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.664914 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhbhn\" (UniqueName: \"kubernetes.io/projected/ab88e6b2-d588-4c1d-8946-89b5fe7c47f1-kube-api-access-hhbhn\") pod \"cert-manager-webhook-5655c58dd6-dbrwz\" (UID: \"ab88e6b2-d588-4c1d-8946-89b5fe7c47f1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.665134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx2pz\" (UniqueName: 
\"kubernetes.io/projected/58b791af-6670-4e2f-8cd7-a55793e8d9ba-kube-api-access-fx2pz\") pod \"cert-manager-cainjector-7f985d654d-8k42j\" (UID: \"58b791af-6670-4e2f-8cd7-a55793e8d9ba\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-8k42j" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.665211 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66d97\" (UniqueName: \"kubernetes.io/projected/851207b0-7920-45e6-b27b-aeda659789b7-kube-api-access-66d97\") pod \"cert-manager-5b446d88c5-4mvl7\" (UID: \"851207b0-7920-45e6-b27b-aeda659789b7\") " pod="cert-manager/cert-manager-5b446d88c5-4mvl7" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.766668 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx2pz\" (UniqueName: \"kubernetes.io/projected/58b791af-6670-4e2f-8cd7-a55793e8d9ba-kube-api-access-fx2pz\") pod \"cert-manager-cainjector-7f985d654d-8k42j\" (UID: \"58b791af-6670-4e2f-8cd7-a55793e8d9ba\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-8k42j" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.766747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66d97\" (UniqueName: \"kubernetes.io/projected/851207b0-7920-45e6-b27b-aeda659789b7-kube-api-access-66d97\") pod \"cert-manager-5b446d88c5-4mvl7\" (UID: \"851207b0-7920-45e6-b27b-aeda659789b7\") " pod="cert-manager/cert-manager-5b446d88c5-4mvl7" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.766798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhbhn\" (UniqueName: \"kubernetes.io/projected/ab88e6b2-d588-4c1d-8946-89b5fe7c47f1-kube-api-access-hhbhn\") pod \"cert-manager-webhook-5655c58dd6-dbrwz\" (UID: \"ab88e6b2-d588-4c1d-8946-89b5fe7c47f1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.795157 
4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhbhn\" (UniqueName: \"kubernetes.io/projected/ab88e6b2-d588-4c1d-8946-89b5fe7c47f1-kube-api-access-hhbhn\") pod \"cert-manager-webhook-5655c58dd6-dbrwz\" (UID: \"ab88e6b2-d588-4c1d-8946-89b5fe7c47f1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.795185 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx2pz\" (UniqueName: \"kubernetes.io/projected/58b791af-6670-4e2f-8cd7-a55793e8d9ba-kube-api-access-fx2pz\") pod \"cert-manager-cainjector-7f985d654d-8k42j\" (UID: \"58b791af-6670-4e2f-8cd7-a55793e8d9ba\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-8k42j" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.796468 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66d97\" (UniqueName: \"kubernetes.io/projected/851207b0-7920-45e6-b27b-aeda659789b7-kube-api-access-66d97\") pod \"cert-manager-5b446d88c5-4mvl7\" (UID: \"851207b0-7920-45e6-b27b-aeda659789b7\") " pod="cert-manager/cert-manager-5b446d88c5-4mvl7" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.801086 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-8k42j" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.821119 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-4mvl7" Oct 01 13:16:35 crc kubenswrapper[4749]: I1001 13:16:35.850100 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" Oct 01 13:16:36 crc kubenswrapper[4749]: I1001 13:16:36.137598 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-dbrwz"] Oct 01 13:16:36 crc kubenswrapper[4749]: I1001 13:16:36.146789 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:16:36 crc kubenswrapper[4749]: I1001 13:16:36.169498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" event={"ID":"ab88e6b2-d588-4c1d-8946-89b5fe7c47f1","Type":"ContainerStarted","Data":"0b3946256e55c56d01a029f83b1033551dba2b43caf43246e48874f04eedae98"} Oct 01 13:16:36 crc kubenswrapper[4749]: I1001 13:16:36.287818 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-8k42j"] Oct 01 13:16:36 crc kubenswrapper[4749]: W1001 13:16:36.289830 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58b791af_6670_4e2f_8cd7_a55793e8d9ba.slice/crio-375440c0d7cd872e964de8d0f78f2e22b6a4675aa617816a17d56c8e0a83fa07 WatchSource:0}: Error finding container 375440c0d7cd872e964de8d0f78f2e22b6a4675aa617816a17d56c8e0a83fa07: Status 404 returned error can't find the container with id 375440c0d7cd872e964de8d0f78f2e22b6a4675aa617816a17d56c8e0a83fa07 Oct 01 13:16:36 crc kubenswrapper[4749]: I1001 13:16:36.304163 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-4mvl7"] Oct 01 13:16:36 crc kubenswrapper[4749]: W1001 13:16:36.309992 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod851207b0_7920_45e6_b27b_aeda659789b7.slice/crio-860ab1da44924e9d7f8c38f1d72880a56a186d4161acd94c0d4d257003ff97fa WatchSource:0}: Error finding container 
860ab1da44924e9d7f8c38f1d72880a56a186d4161acd94c0d4d257003ff97fa: Status 404 returned error can't find the container with id 860ab1da44924e9d7f8c38f1d72880a56a186d4161acd94c0d4d257003ff97fa Oct 01 13:16:37 crc kubenswrapper[4749]: I1001 13:16:37.177927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4mvl7" event={"ID":"851207b0-7920-45e6-b27b-aeda659789b7","Type":"ContainerStarted","Data":"860ab1da44924e9d7f8c38f1d72880a56a186d4161acd94c0d4d257003ff97fa"} Oct 01 13:16:37 crc kubenswrapper[4749]: I1001 13:16:37.179822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-8k42j" event={"ID":"58b791af-6670-4e2f-8cd7-a55793e8d9ba","Type":"ContainerStarted","Data":"375440c0d7cd872e964de8d0f78f2e22b6a4675aa617816a17d56c8e0a83fa07"} Oct 01 13:16:48 crc kubenswrapper[4749]: I1001 13:16:48.276356 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-8k42j" event={"ID":"58b791af-6670-4e2f-8cd7-a55793e8d9ba","Type":"ContainerStarted","Data":"3db90eda6439c72db8064fdcfd1bdedf8bea122892eee8140c2ae794710aa7cc"} Oct 01 13:16:48 crc kubenswrapper[4749]: I1001 13:16:48.293782 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-8k42j" podStartSLOduration=3.082097457 podStartE2EDuration="13.293765234s" podCreationTimestamp="2025-10-01 13:16:35 +0000 UTC" firstStartedPulling="2025-10-01 13:16:36.294341704 +0000 UTC m=+656.348326603" lastFinishedPulling="2025-10-01 13:16:46.506009451 +0000 UTC m=+666.559994380" observedRunningTime="2025-10-01 13:16:48.293032264 +0000 UTC m=+668.347017173" watchObservedRunningTime="2025-10-01 13:16:48.293765234 +0000 UTC m=+668.347750143" Oct 01 13:16:49 crc kubenswrapper[4749]: I1001 13:16:49.284506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" 
event={"ID":"ab88e6b2-d588-4c1d-8946-89b5fe7c47f1","Type":"ContainerStarted","Data":"a5a931b4d1f34ac765fc287c750fcb38db8260e19e276f6d096247f638aee96c"} Oct 01 13:16:49 crc kubenswrapper[4749]: I1001 13:16:49.306854 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" podStartSLOduration=1.960167492 podStartE2EDuration="14.306835804s" podCreationTimestamp="2025-10-01 13:16:35 +0000 UTC" firstStartedPulling="2025-10-01 13:16:36.146574932 +0000 UTC m=+656.200559821" lastFinishedPulling="2025-10-01 13:16:48.493243234 +0000 UTC m=+668.547228133" observedRunningTime="2025-10-01 13:16:49.30268144 +0000 UTC m=+669.356666379" watchObservedRunningTime="2025-10-01 13:16:49.306835804 +0000 UTC m=+669.360820703" Oct 01 13:16:50 crc kubenswrapper[4749]: I1001 13:16:50.298421 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" Oct 01 13:16:52 crc kubenswrapper[4749]: I1001 13:16:52.313190 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-4mvl7" event={"ID":"851207b0-7920-45e6-b27b-aeda659789b7","Type":"ContainerStarted","Data":"3184faef05be41b57d1b361af70c7067b568cf6dda6a439bb25f08db57ef14c5"} Oct 01 13:16:52 crc kubenswrapper[4749]: I1001 13:16:52.336351 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-4mvl7" podStartSLOduration=2.261401342 podStartE2EDuration="17.336327566s" podCreationTimestamp="2025-10-01 13:16:35 +0000 UTC" firstStartedPulling="2025-10-01 13:16:36.312499872 +0000 UTC m=+656.366484791" lastFinishedPulling="2025-10-01 13:16:51.387426086 +0000 UTC m=+671.441411015" observedRunningTime="2025-10-01 13:16:52.332918603 +0000 UTC m=+672.386903582" watchObservedRunningTime="2025-10-01 13:16:52.336327566 +0000 UTC m=+672.390312505" Oct 01 13:16:55 crc kubenswrapper[4749]: I1001 13:16:55.856679 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-dbrwz" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.513209 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fgjjp"] Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.514916 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovn-controller" containerID="cri-o://ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4" gracePeriod=30 Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.514969 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="nbdb" containerID="cri-o://19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b" gracePeriod=30 Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.515098 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="northd" containerID="cri-o://e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81" gracePeriod=30 Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.515184 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="sbdb" containerID="cri-o://58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9" gracePeriod=30 Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.515211 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" 
containerName="kube-rbac-proxy-node" containerID="cri-o://eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2" gracePeriod=30 Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.515190 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e" gracePeriod=30 Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.515320 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovn-acl-logging" containerID="cri-o://318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43" gracePeriod=30 Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.556905 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" containerID="cri-o://0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5" gracePeriod=30 Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.900940 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/3.log" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.903852 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovn-acl-logging/0.log" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.904606 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovn-controller/0.log" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 
13:17:05.905482 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.969622 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-t7d6l"] Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970182 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970203 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970218 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovn-acl-logging" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970227 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovn-acl-logging" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970257 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovn-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970268 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovn-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970540 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970555 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970568 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kube-rbac-proxy-node" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970607 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kube-rbac-proxy-node" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970621 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970629 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970641 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="nbdb" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970650 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="nbdb" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970666 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="sbdb" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970674 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="sbdb" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970723 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.970732 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.970767 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="northd" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.971616 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="northd" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.971954 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kubecfg-setup" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.971966 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kubecfg-setup" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.971978 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.971986 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972163 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovn-acl-logging" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972183 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972193 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972203 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="northd" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972211 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972226 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="sbdb" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972259 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovn-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972271 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972281 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="kube-rbac-proxy-node" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972293 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972302 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="nbdb" Oct 01 13:17:05 crc kubenswrapper[4749]: E1001 13:17:05.972413 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972422 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.972582 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerName="ovnkube-controller" Oct 01 13:17:05 crc kubenswrapper[4749]: I1001 13:17:05.974919 4749 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.068994 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpcbh\" (UniqueName: \"kubernetes.io/projected/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-kube-api-access-lpcbh\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069033 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovn-node-metrics-cert\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069057 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-netns\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069077 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-openvswitch\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069116 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-env-overrides\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069166 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-var-lib-openvswitch\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069183 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-log-socket\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069211 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-node-log\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069238 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-config\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069198 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-systemd\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069353 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-script-lib\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069398 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-kubelet\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-systemd-units\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069471 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069544 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-ovn\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069590 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-bin\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069621 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-ovn-kubernetes\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069650 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-netd\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-slash\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069743 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-etc-openvswitch\") pod \"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\" (UID: 
\"f6a678bb-9e17-4b2a-bef9-dea34bc3c218\") " Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070060 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.069727 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070037 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-log-socket" (OuterVolumeSpecName: "log-socket") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070042 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070137 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070174 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070206 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070261 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070283 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070298 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-slash" (OuterVolumeSpecName: "host-slash") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-node-log" (OuterVolumeSpecName: "node-log") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070387 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.070967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.075402 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-kube-api-access-lpcbh" (OuterVolumeSpecName: "kube-api-access-lpcbh") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "kube-api-access-lpcbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.077645 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.091267 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f6a678bb-9e17-4b2a-bef9-dea34bc3c218" (UID: "f6a678bb-9e17-4b2a-bef9-dea34bc3c218"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.171520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.171618 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-run-netns\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.171674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-ovnkube-config\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.171820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-systemd-units\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172009 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-kubelet\") pod 
\"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-cni-bin\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b1a05f3-24fd-463b-9d86-1318b170e569-ovn-node-metrics-cert\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172275 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9qc8\" (UniqueName: \"kubernetes.io/projected/3b1a05f3-24fd-463b-9d86-1318b170e569-kube-api-access-k9qc8\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172335 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-systemd\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172381 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-ovnkube-script-lib\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172431 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-node-log\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172486 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-cni-netd\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172625 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172715 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-log-socket\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-slash\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172818 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-env-overrides\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172864 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-var-lib-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.172931 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-ovn\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173008 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-etc-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173105 4749 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173135 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173165 4749 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173187 4749 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173210 4749 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173272 4749 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173292 4749 reconciler_common.go:293] "Volume detached for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173309 4749 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173326 4749 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173344 4749 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173360 4749 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173376 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpcbh\" (UniqueName: \"kubernetes.io/projected/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-kube-api-access-lpcbh\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173392 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173408 4749 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173426 4749 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173444 4749 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173460 4749 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173476 4749 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.173492 4749 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6a678bb-9e17-4b2a-bef9-dea34bc3c218-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275285 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-etc-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275356 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-run-netns\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275431 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-ovnkube-config\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275466 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-systemd-units\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275524 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-kubelet\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-cni-bin\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b1a05f3-24fd-463b-9d86-1318b170e569-ovn-node-metrics-cert\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9qc8\" (UniqueName: \"kubernetes.io/projected/3b1a05f3-24fd-463b-9d86-1318b170e569-kube-api-access-k9qc8\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275659 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-systemd\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275690 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-ovnkube-script-lib\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-node-log\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275756 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275835 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-cni-netd\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-log-socket\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-slash\") 
pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.275971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-env-overrides\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.276017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-var-lib-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.276065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-ovn\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.276202 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-ovn\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.276316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-etc-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.276375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.276428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-run-netns\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.276848 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-node-log\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.276993 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-systemd-units\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.277066 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-kubelet\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc 
kubenswrapper[4749]: I1001 13:17:06.277139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-cni-bin\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.277441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-systemd\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.277609 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-log-socket\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.277804 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-cni-netd\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.277686 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-var-lib-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.277755 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-run-openvswitch\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.277756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-slash\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.277829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3b1a05f3-24fd-463b-9d86-1318b170e569-host-run-ovn-kubernetes\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.278453 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-env-overrides\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.278715 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-ovnkube-config\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.279533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/3b1a05f3-24fd-463b-9d86-1318b170e569-ovnkube-script-lib\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.286549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3b1a05f3-24fd-463b-9d86-1318b170e569-ovn-node-metrics-cert\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.314376 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9qc8\" (UniqueName: \"kubernetes.io/projected/3b1a05f3-24fd-463b-9d86-1318b170e569-kube-api-access-k9qc8\") pod \"ovnkube-node-t7d6l\" (UID: \"3b1a05f3-24fd-463b-9d86-1318b170e569\") " pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.414371 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/2.log" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.415418 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/1.log" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.415498 4749 generic.go:334] "Generic (PLEG): container finished" podID="33919a9e-1f0d-4127-915d-17d77d78853e" containerID="dafe8b3793369956dfb95caebe1bb42e3642c1ecebf130cd340a4642e7927aa7" exitCode=2 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.415592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrgp7" event={"ID":"33919a9e-1f0d-4127-915d-17d77d78853e","Type":"ContainerDied","Data":"dafe8b3793369956dfb95caebe1bb42e3642c1ecebf130cd340a4642e7927aa7"} Oct 01 13:17:06 crc 
kubenswrapper[4749]: I1001 13:17:06.415658 4749 scope.go:117] "RemoveContainer" containerID="b38a7f70ee60a91fce57dde930befc389fc6cda29f06b5461084fb700a449da6" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.416386 4749 scope.go:117] "RemoveContainer" containerID="dafe8b3793369956dfb95caebe1bb42e3642c1ecebf130cd340a4642e7927aa7" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.416772 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nrgp7_openshift-multus(33919a9e-1f0d-4127-915d-17d77d78853e)\"" pod="openshift-multus/multus-nrgp7" podUID="33919a9e-1f0d-4127-915d-17d77d78853e" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.421753 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovnkube-controller/3.log" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.426699 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovn-acl-logging/0.log" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.427611 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fgjjp_f6a678bb-9e17-4b2a-bef9-dea34bc3c218/ovn-controller/0.log" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428560 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5" exitCode=0 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428612 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9" exitCode=0 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 
13:17:06.428636 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b" exitCode=0 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428658 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81" exitCode=0 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428676 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e" exitCode=0 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428701 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2" exitCode=0 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428720 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43" exitCode=143 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428758 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" containerID="ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4" exitCode=143 Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" 
event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428854 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428881 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428681 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.428998 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429022 4749 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429038 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429053 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429068 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429083 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429097 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429111 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429124 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429138 4749 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429183 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429200 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429214 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429266 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429283 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429297 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429311 4749 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429326 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429340 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429354 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429414 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429435 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429452 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} Oct 01 
13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429466 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429481 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429496 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429512 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429529 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429544 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429560 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429585 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fgjjp" 
event={"ID":"f6a678bb-9e17-4b2a-bef9-dea34bc3c218","Type":"ContainerDied","Data":"5439fc0322b5913fb92e4f8e8001dd36a895646ea758ee48478f6442c95c800c"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429610 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429629 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429646 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429662 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429679 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429695 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429710 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429726 4749 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429742 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.429758 4749 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.470839 4749 scope.go:117] "RemoveContainer" containerID="0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.502362 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fgjjp"] Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.506602 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.514420 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fgjjp"] Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.539673 4749 scope.go:117] "RemoveContainer" containerID="58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.555548 4749 scope.go:117] "RemoveContainer" containerID="19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.572031 4749 scope.go:117] "RemoveContainer" containerID="e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.585195 4749 scope.go:117] "RemoveContainer" 
containerID="e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.589297 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.606751 4749 scope.go:117] "RemoveContainer" containerID="eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2" Oct 01 13:17:06 crc kubenswrapper[4749]: W1001 13:17:06.615820 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b1a05f3_24fd_463b_9d86_1318b170e569.slice/crio-03c75b5e9a9be2fc015c41ec99a846712a4b98222db9aa77265ae1cd51c647ce WatchSource:0}: Error finding container 03c75b5e9a9be2fc015c41ec99a846712a4b98222db9aa77265ae1cd51c647ce: Status 404 returned error can't find the container with id 03c75b5e9a9be2fc015c41ec99a846712a4b98222db9aa77265ae1cd51c647ce Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.626526 4749 scope.go:117] "RemoveContainer" containerID="318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.663384 4749 scope.go:117] "RemoveContainer" containerID="ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.699255 4749 scope.go:117] "RemoveContainer" containerID="c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.728760 4749 scope.go:117] "RemoveContainer" containerID="0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.731744 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": container with ID starting with 
0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5 not found: ID does not exist" containerID="0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.731795 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} err="failed to get container status \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": rpc error: code = NotFound desc = could not find container \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": container with ID starting with 0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.731822 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.733257 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": container with ID starting with f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743 not found: ID does not exist" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.733303 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} err="failed to get container status \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": rpc error: code = NotFound desc = could not find container \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": container with ID starting with f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743 not found: ID does not 
exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.733329 4749 scope.go:117] "RemoveContainer" containerID="58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.734016 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": container with ID starting with 58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9 not found: ID does not exist" containerID="58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.734077 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} err="failed to get container status \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": rpc error: code = NotFound desc = could not find container \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": container with ID starting with 58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.734173 4749 scope.go:117] "RemoveContainer" containerID="19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.734744 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": container with ID starting with 19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b not found: ID does not exist" containerID="19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.734775 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} err="failed to get container status \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": rpc error: code = NotFound desc = could not find container \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": container with ID starting with 19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.734795 4749 scope.go:117] "RemoveContainer" containerID="e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.735147 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": container with ID starting with e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81 not found: ID does not exist" containerID="e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.735199 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} err="failed to get container status \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": rpc error: code = NotFound desc = could not find container \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": container with ID starting with e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.735261 4749 scope.go:117] "RemoveContainer" containerID="e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.735752 4749 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": container with ID starting with e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e not found: ID does not exist" containerID="e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.735783 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} err="failed to get container status \"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": rpc error: code = NotFound desc = could not find container \"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": container with ID starting with e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.735802 4749 scope.go:117] "RemoveContainer" containerID="eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.736256 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": container with ID starting with eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2 not found: ID does not exist" containerID="eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.736285 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} err="failed to get container status \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": rpc error: code = NotFound desc = could 
not find container \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": container with ID starting with eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.736301 4749 scope.go:117] "RemoveContainer" containerID="318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.736588 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": container with ID starting with 318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43 not found: ID does not exist" containerID="318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.736611 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} err="failed to get container status \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": rpc error: code = NotFound desc = could not find container \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": container with ID starting with 318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.736628 4749 scope.go:117] "RemoveContainer" containerID="ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.736905 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": container with ID starting with ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4 not found: 
ID does not exist" containerID="ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.736931 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} err="failed to get container status \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": rpc error: code = NotFound desc = could not find container \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": container with ID starting with ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.736951 4749 scope.go:117] "RemoveContainer" containerID="c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127" Oct 01 13:17:06 crc kubenswrapper[4749]: E1001 13:17:06.737352 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": container with ID starting with c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127 not found: ID does not exist" containerID="c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.737381 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} err="failed to get container status \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": rpc error: code = NotFound desc = could not find container \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": container with ID starting with c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.737400 4749 
scope.go:117] "RemoveContainer" containerID="0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.737827 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} err="failed to get container status \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": rpc error: code = NotFound desc = could not find container \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": container with ID starting with 0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.737850 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.738204 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} err="failed to get container status \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": rpc error: code = NotFound desc = could not find container \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": container with ID starting with f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.738251 4749 scope.go:117] "RemoveContainer" containerID="58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.738724 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} err="failed to get container status \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": rpc 
error: code = NotFound desc = could not find container \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": container with ID starting with 58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.738745 4749 scope.go:117] "RemoveContainer" containerID="19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.739154 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} err="failed to get container status \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": rpc error: code = NotFound desc = could not find container \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": container with ID starting with 19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.739181 4749 scope.go:117] "RemoveContainer" containerID="e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.739597 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} err="failed to get container status \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": rpc error: code = NotFound desc = could not find container \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": container with ID starting with e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.739623 4749 scope.go:117] "RemoveContainer" containerID="e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e" Oct 01 13:17:06 crc 
kubenswrapper[4749]: I1001 13:17:06.739842 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} err="failed to get container status \"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": rpc error: code = NotFound desc = could not find container \"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": container with ID starting with e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.739862 4749 scope.go:117] "RemoveContainer" containerID="eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.740083 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} err="failed to get container status \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": rpc error: code = NotFound desc = could not find container \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": container with ID starting with eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.740109 4749 scope.go:117] "RemoveContainer" containerID="318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.740368 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} err="failed to get container status \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": rpc error: code = NotFound desc = could not find container \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": container 
with ID starting with 318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.740390 4749 scope.go:117] "RemoveContainer" containerID="ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.740631 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} err="failed to get container status \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": rpc error: code = NotFound desc = could not find container \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": container with ID starting with ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.740647 4749 scope.go:117] "RemoveContainer" containerID="c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.740888 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} err="failed to get container status \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": rpc error: code = NotFound desc = could not find container \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": container with ID starting with c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.740902 4749 scope.go:117] "RemoveContainer" containerID="0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.741316 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} err="failed to get container status \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": rpc error: code = NotFound desc = could not find container \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": container with ID starting with 0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.741349 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.741701 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} err="failed to get container status \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": rpc error: code = NotFound desc = could not find container \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": container with ID starting with f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.741729 4749 scope.go:117] "RemoveContainer" containerID="58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.742139 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} err="failed to get container status \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": rpc error: code = NotFound desc = could not find container \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": container with ID starting with 58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9 not found: ID does not 
exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.742163 4749 scope.go:117] "RemoveContainer" containerID="19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.742433 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} err="failed to get container status \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": rpc error: code = NotFound desc = could not find container \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": container with ID starting with 19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.742456 4749 scope.go:117] "RemoveContainer" containerID="e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.742766 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} err="failed to get container status \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": rpc error: code = NotFound desc = could not find container \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": container with ID starting with e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.742788 4749 scope.go:117] "RemoveContainer" containerID="e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.743123 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} err="failed to get container status 
\"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": rpc error: code = NotFound desc = could not find container \"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": container with ID starting with e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.743147 4749 scope.go:117] "RemoveContainer" containerID="eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.743574 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} err="failed to get container status \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": rpc error: code = NotFound desc = could not find container \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": container with ID starting with eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.743629 4749 scope.go:117] "RemoveContainer" containerID="318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.744012 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} err="failed to get container status \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": rpc error: code = NotFound desc = could not find container \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": container with ID starting with 318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.744035 4749 scope.go:117] "RemoveContainer" 
containerID="ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.744408 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} err="failed to get container status \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": rpc error: code = NotFound desc = could not find container \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": container with ID starting with ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.744458 4749 scope.go:117] "RemoveContainer" containerID="c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.744965 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} err="failed to get container status \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": rpc error: code = NotFound desc = could not find container \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": container with ID starting with c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.744991 4749 scope.go:117] "RemoveContainer" containerID="0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.745407 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5"} err="failed to get container status \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": rpc error: code = NotFound desc = could 
not find container \"0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5\": container with ID starting with 0192593cd95bec20ebd9c54331638ece4d5b39045162050b55da98843ee68ec5 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.745441 4749 scope.go:117] "RemoveContainer" containerID="f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.745699 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743"} err="failed to get container status \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": rpc error: code = NotFound desc = could not find container \"f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743\": container with ID starting with f8aeac0e35e7ebe5eb09907bf4e04efdc7ef37998dd06cee5afb588dec21e743 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.745725 4749 scope.go:117] "RemoveContainer" containerID="58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.746066 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9"} err="failed to get container status \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": rpc error: code = NotFound desc = could not find container \"58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9\": container with ID starting with 58ed9f2b917d8fd8f0415f03cf6f9b25cb86268208f0fcd0ab9942df8e8423a9 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.746121 4749 scope.go:117] "RemoveContainer" containerID="19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 
13:17:06.746568 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b"} err="failed to get container status \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": rpc error: code = NotFound desc = could not find container \"19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b\": container with ID starting with 19f7341a9421f13c133eba7e8b7b989304c42faa12dabe9d664d91f32989dd2b not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.746604 4749 scope.go:117] "RemoveContainer" containerID="e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.746905 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81"} err="failed to get container status \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": rpc error: code = NotFound desc = could not find container \"e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81\": container with ID starting with e6ff68f4c10d7229aca5224e1271d3de7874422de728acc472297abbf3a48b81 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.746937 4749 scope.go:117] "RemoveContainer" containerID="e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.747268 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e"} err="failed to get container status \"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": rpc error: code = NotFound desc = could not find container \"e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e\": container with ID starting with 
e73ace573cc0c4d918779f3e3807e1b362422af5750086b280ff41d79866ba5e not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.747299 4749 scope.go:117] "RemoveContainer" containerID="eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.747547 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2"} err="failed to get container status \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": rpc error: code = NotFound desc = could not find container \"eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2\": container with ID starting with eb1b7ac2ead12dea303fe69f3af2d84b1698c54ce690f3d42fc1c5129c9dc6a2 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.747577 4749 scope.go:117] "RemoveContainer" containerID="318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.747794 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43"} err="failed to get container status \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": rpc error: code = NotFound desc = could not find container \"318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43\": container with ID starting with 318cbae46e32390a1ff55ee1a9740ebea368541babdd74ee15aa50e5b1fb0b43 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.747827 4749 scope.go:117] "RemoveContainer" containerID="ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.748053 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4"} err="failed to get container status \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": rpc error: code = NotFound desc = could not find container \"ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4\": container with ID starting with ea86601535c0a4c400d31f09524bd216904350a945a8337d16ea6bc736d6b4a4 not found: ID does not exist" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.748083 4749 scope.go:117] "RemoveContainer" containerID="c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127" Oct 01 13:17:06 crc kubenswrapper[4749]: I1001 13:17:06.748481 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127"} err="failed to get container status \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": rpc error: code = NotFound desc = could not find container \"c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127\": container with ID starting with c8289feb99840295c4b1d7367153a78237b3aff30dfbe701572e3a4aeba35127 not found: ID does not exist" Oct 01 13:17:07 crc kubenswrapper[4749]: I1001 13:17:07.243355 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6a678bb-9e17-4b2a-bef9-dea34bc3c218" path="/var/lib/kubelet/pods/f6a678bb-9e17-4b2a-bef9-dea34bc3c218/volumes" Oct 01 13:17:07 crc kubenswrapper[4749]: I1001 13:17:07.440656 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/2.log" Oct 01 13:17:07 crc kubenswrapper[4749]: I1001 13:17:07.445596 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b1a05f3-24fd-463b-9d86-1318b170e569" containerID="db8a6a668fcf7161b5e9d93b0d0562b8303c45b3b9cd23b2f50c3624f07841ff" exitCode=0 Oct 01 13:17:07 crc kubenswrapper[4749]: 
I1001 13:17:07.445657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerDied","Data":"db8a6a668fcf7161b5e9d93b0d0562b8303c45b3b9cd23b2f50c3624f07841ff"} Oct 01 13:17:07 crc kubenswrapper[4749]: I1001 13:17:07.445695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"03c75b5e9a9be2fc015c41ec99a846712a4b98222db9aa77265ae1cd51c647ce"} Oct 01 13:17:08 crc kubenswrapper[4749]: I1001 13:17:08.457198 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"56e09cf3b424116bc36a33fc0931008b55378f029470a1d648a3149dd5ee9c04"} Oct 01 13:17:08 crc kubenswrapper[4749]: I1001 13:17:08.457308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"93ed947d9d9cf915e044ff66b541bb0cb0d3306b5f8e8cbef6610879c51a1000"} Oct 01 13:17:08 crc kubenswrapper[4749]: I1001 13:17:08.457337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"bc4ed08f0c50edc16b9912a098fbc24f9700e32a4ebcdf8dc32f4653839db1fd"} Oct 01 13:17:09 crc kubenswrapper[4749]: I1001 13:17:09.468670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"76c41ccb671f0d38282fa145e9bccd194e304dccbdde8df19c95eeacd8395e4c"} Oct 01 13:17:09 crc kubenswrapper[4749]: I1001 13:17:09.469122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"b601c95c445ac24f34953e9ba134d12bb3de59446500addcbf141e2a3dfc5c6d"} Oct 01 13:17:09 crc kubenswrapper[4749]: I1001 13:17:09.469147 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"8435683462fdd1df982c5ad59b240e45acc9996974058c4e34ca3016d093efd9"} Oct 01 13:17:11 crc kubenswrapper[4749]: I1001 13:17:11.485232 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"8d9d6ba1e5dcfc074b069714e7c4f17ea081fcd1f30db102131ba8edb966fb6b"} Oct 01 13:17:14 crc kubenswrapper[4749]: I1001 13:17:14.518561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" event={"ID":"3b1a05f3-24fd-463b-9d86-1318b170e569","Type":"ContainerStarted","Data":"981a23262449aefd0966d6b97e75478635a10907bb0271037a64ca1c8f79a8e4"} Oct 01 13:17:14 crc kubenswrapper[4749]: I1001 13:17:14.519272 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:14 crc kubenswrapper[4749]: I1001 13:17:14.519292 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:14 crc kubenswrapper[4749]: I1001 13:17:14.519314 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:14 crc kubenswrapper[4749]: I1001 13:17:14.559570 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:14 crc kubenswrapper[4749]: I1001 13:17:14.564451 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:14 crc kubenswrapper[4749]: I1001 13:17:14.572707 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" podStartSLOduration=9.572686795 podStartE2EDuration="9.572686795s" podCreationTimestamp="2025-10-01 13:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:17:14.568767818 +0000 UTC m=+694.622752717" watchObservedRunningTime="2025-10-01 13:17:14.572686795 +0000 UTC m=+694.626671734" Oct 01 13:17:21 crc kubenswrapper[4749]: I1001 13:17:21.235457 4749 scope.go:117] "RemoveContainer" containerID="dafe8b3793369956dfb95caebe1bb42e3642c1ecebf130cd340a4642e7927aa7" Oct 01 13:17:21 crc kubenswrapper[4749]: E1001 13:17:21.236453 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-nrgp7_openshift-multus(33919a9e-1f0d-4127-915d-17d77d78853e)\"" pod="openshift-multus/multus-nrgp7" podUID="33919a9e-1f0d-4127-915d-17d77d78853e" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.584825 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c"] Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.586892 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.590129 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.603140 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c"] Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.770455 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.770755 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.770819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdxc\" (UniqueName: \"kubernetes.io/projected/2629ec5a-9fe4-4220-812b-d5c9597e5363-kube-api-access-5mdxc\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: 
I1001 13:17:26.872389 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.872937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.872973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdxc\" (UniqueName: \"kubernetes.io/projected/2629ec5a-9fe4-4220-812b-d5c9597e5363-kube-api-access-5mdxc\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.873210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.873623 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.910956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdxc\" (UniqueName: \"kubernetes.io/projected/2629ec5a-9fe4-4220-812b-d5c9597e5363-kube-api-access-5mdxc\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: I1001 13:17:26.924210 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: E1001 13:17:26.967321 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace_2629ec5a-9fe4-4220-812b-d5c9597e5363_0(f407b0177ffa14b937acf2c8cf2dad4e83cc9c2f164f38af39d7e1a02cfd16c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 13:17:26 crc kubenswrapper[4749]: E1001 13:17:26.967417 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace_2629ec5a-9fe4-4220-812b-d5c9597e5363_0(f407b0177ffa14b937acf2c8cf2dad4e83cc9c2f164f38af39d7e1a02cfd16c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: E1001 13:17:26.967456 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace_2629ec5a-9fe4-4220-812b-d5c9597e5363_0(f407b0177ffa14b937acf2c8cf2dad4e83cc9c2f164f38af39d7e1a02cfd16c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:26 crc kubenswrapper[4749]: E1001 13:17:26.967529 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace(2629ec5a-9fe4-4220-812b-d5c9597e5363)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace(2629ec5a-9fe4-4220-812b-d5c9597e5363)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace_2629ec5a-9fe4-4220-812b-d5c9597e5363_0(f407b0177ffa14b937acf2c8cf2dad4e83cc9c2f164f38af39d7e1a02cfd16c7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" Oct 01 13:17:27 crc kubenswrapper[4749]: I1001 13:17:27.613862 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:27 crc kubenswrapper[4749]: I1001 13:17:27.614684 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:27 crc kubenswrapper[4749]: E1001 13:17:27.647909 4749 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace_2629ec5a-9fe4-4220-812b-d5c9597e5363_0(ec1c54d36d17cd87e7b10d1d6ee591918045e5ff4cbb1cfaacac16dd85bbeea1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 13:17:27 crc kubenswrapper[4749]: E1001 13:17:27.648052 4749 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace_2629ec5a-9fe4-4220-812b-d5c9597e5363_0(ec1c54d36d17cd87e7b10d1d6ee591918045e5ff4cbb1cfaacac16dd85bbeea1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:27 crc kubenswrapper[4749]: E1001 13:17:27.648106 4749 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace_2629ec5a-9fe4-4220-812b-d5c9597e5363_0(ec1c54d36d17cd87e7b10d1d6ee591918045e5ff4cbb1cfaacac16dd85bbeea1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:27 crc kubenswrapper[4749]: E1001 13:17:27.648197 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace(2629ec5a-9fe4-4220-812b-d5c9597e5363)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace(2629ec5a-9fe4-4220-812b-d5c9597e5363)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_openshift-marketplace_2629ec5a-9fe4-4220-812b-d5c9597e5363_0(ec1c54d36d17cd87e7b10d1d6ee591918045e5ff4cbb1cfaacac16dd85bbeea1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" Oct 01 13:17:32 crc kubenswrapper[4749]: I1001 13:17:32.106421 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:17:32 crc kubenswrapper[4749]: I1001 13:17:32.108513 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:17:33 crc kubenswrapper[4749]: I1001 13:17:33.230464 4749 scope.go:117] "RemoveContainer" containerID="dafe8b3793369956dfb95caebe1bb42e3642c1ecebf130cd340a4642e7927aa7" 
Oct 01 13:17:33 crc kubenswrapper[4749]: I1001 13:17:33.655155 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nrgp7_33919a9e-1f0d-4127-915d-17d77d78853e/kube-multus/2.log" Oct 01 13:17:33 crc kubenswrapper[4749]: I1001 13:17:33.655528 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nrgp7" event={"ID":"33919a9e-1f0d-4127-915d-17d77d78853e","Type":"ContainerStarted","Data":"049f8b954d5407bda0eb922454071eaf1d57abd000a198cf5bc1e3a9f741e4fc"} Oct 01 13:17:36 crc kubenswrapper[4749]: I1001 13:17:36.625171 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-t7d6l" Oct 01 13:17:41 crc kubenswrapper[4749]: I1001 13:17:41.229526 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:41 crc kubenswrapper[4749]: I1001 13:17:41.235004 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:41 crc kubenswrapper[4749]: I1001 13:17:41.492839 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c"] Oct 01 13:17:41 crc kubenswrapper[4749]: I1001 13:17:41.711210 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" event={"ID":"2629ec5a-9fe4-4220-812b-d5c9597e5363","Type":"ContainerStarted","Data":"6ed76c7041c6f53d1655cf5e7e9a9b977ccb3f23d2e70b8b25ed28b098b22bbc"} Oct 01 13:17:41 crc kubenswrapper[4749]: I1001 13:17:41.711275 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" event={"ID":"2629ec5a-9fe4-4220-812b-d5c9597e5363","Type":"ContainerStarted","Data":"72975e6d330fcd24724d39101244c283295d324712faaf1765739758c36aa0e1"} Oct 01 13:17:42 crc kubenswrapper[4749]: I1001 13:17:42.720277 4749 generic.go:334] "Generic (PLEG): container finished" podID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerID="6ed76c7041c6f53d1655cf5e7e9a9b977ccb3f23d2e70b8b25ed28b098b22bbc" exitCode=0 Oct 01 13:17:42 crc kubenswrapper[4749]: I1001 13:17:42.720448 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" event={"ID":"2629ec5a-9fe4-4220-812b-d5c9597e5363","Type":"ContainerDied","Data":"6ed76c7041c6f53d1655cf5e7e9a9b977ccb3f23d2e70b8b25ed28b098b22bbc"} Oct 01 13:17:44 crc kubenswrapper[4749]: I1001 13:17:44.736909 4749 generic.go:334] "Generic (PLEG): container finished" podID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerID="d1744e14b81d9426ab61dc7c8e143b5b3d56e6133e5844902d119f483e6dba35" exitCode=0 Oct 01 13:17:44 crc kubenswrapper[4749]: I1001 13:17:44.737025 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" event={"ID":"2629ec5a-9fe4-4220-812b-d5c9597e5363","Type":"ContainerDied","Data":"d1744e14b81d9426ab61dc7c8e143b5b3d56e6133e5844902d119f483e6dba35"} Oct 01 13:17:45 crc kubenswrapper[4749]: I1001 13:17:45.750329 4749 generic.go:334] "Generic (PLEG): container finished" podID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerID="3eb869d9d2145b8dd16f812ec043099c6d0fc5a8149839c34547b9b56efb6680" exitCode=0 Oct 01 13:17:45 crc kubenswrapper[4749]: I1001 13:17:45.750393 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" event={"ID":"2629ec5a-9fe4-4220-812b-d5c9597e5363","Type":"ContainerDied","Data":"3eb869d9d2145b8dd16f812ec043099c6d0fc5a8149839c34547b9b56efb6680"} Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.090979 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.152185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-bundle\") pod \"2629ec5a-9fe4-4220-812b-d5c9597e5363\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.152255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-util\") pod \"2629ec5a-9fe4-4220-812b-d5c9597e5363\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.152297 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mdxc\" (UniqueName: \"kubernetes.io/projected/2629ec5a-9fe4-4220-812b-d5c9597e5363-kube-api-access-5mdxc\") pod \"2629ec5a-9fe4-4220-812b-d5c9597e5363\" (UID: \"2629ec5a-9fe4-4220-812b-d5c9597e5363\") " Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.156638 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-bundle" (OuterVolumeSpecName: "bundle") pod "2629ec5a-9fe4-4220-812b-d5c9597e5363" (UID: "2629ec5a-9fe4-4220-812b-d5c9597e5363"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.159918 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2629ec5a-9fe4-4220-812b-d5c9597e5363-kube-api-access-5mdxc" (OuterVolumeSpecName: "kube-api-access-5mdxc") pod "2629ec5a-9fe4-4220-812b-d5c9597e5363" (UID: "2629ec5a-9fe4-4220-812b-d5c9597e5363"). InnerVolumeSpecName "kube-api-access-5mdxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.253176 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.253229 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mdxc\" (UniqueName: \"kubernetes.io/projected/2629ec5a-9fe4-4220-812b-d5c9597e5363-kube-api-access-5mdxc\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.536304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-util" (OuterVolumeSpecName: "util") pod "2629ec5a-9fe4-4220-812b-d5c9597e5363" (UID: "2629ec5a-9fe4-4220-812b-d5c9597e5363"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.557003 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2629ec5a-9fe4-4220-812b-d5c9597e5363-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.768010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" event={"ID":"2629ec5a-9fe4-4220-812b-d5c9597e5363","Type":"ContainerDied","Data":"72975e6d330fcd24724d39101244c283295d324712faaf1765739758c36aa0e1"} Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.768094 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72975e6d330fcd24724d39101244c283295d324712faaf1765739758c36aa0e1" Oct 01 13:17:47 crc kubenswrapper[4749]: I1001 13:17:47.768097 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.038239 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2"] Oct 01 13:17:59 crc kubenswrapper[4749]: E1001 13:17:59.038916 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerName="pull" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.038931 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerName="pull" Oct 01 13:17:59 crc kubenswrapper[4749]: E1001 13:17:59.038947 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerName="extract" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.038955 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerName="extract" Oct 01 13:17:59 crc kubenswrapper[4749]: E1001 13:17:59.038982 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerName="util" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.038990 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerName="util" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.039111 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2629ec5a-9fe4-4220-812b-d5c9597e5363" containerName="extract" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.039539 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.042673 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rtff4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.042898 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.043096 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.063746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.121667 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.122540 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.125119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.125355 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-rzx72" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.135326 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r99h\" (UniqueName: \"kubernetes.io/projected/74ad4854-4091-4485-b4da-881846999f3b-kube-api-access-2r99h\") pod \"obo-prometheus-operator-7c8cf85677-jmcm2\" (UID: \"74ad4854-4091-4485-b4da-881846999f3b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.148149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.151002 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.151639 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.161153 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.236738 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r99h\" (UniqueName: \"kubernetes.io/projected/74ad4854-4091-4485-b4da-881846999f3b-kube-api-access-2r99h\") pod \"obo-prometheus-operator-7c8cf85677-jmcm2\" (UID: \"74ad4854-4091-4485-b4da-881846999f3b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.236797 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2393602c-447f-4166-8ce8-7cb58c8d5510-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4\" (UID: \"2393602c-447f-4166-8ce8-7cb58c8d5510\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.236826 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2393602c-447f-4166-8ce8-7cb58c8d5510-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4\" (UID: \"2393602c-447f-4166-8ce8-7cb58c8d5510\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.236865 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f69075a9-9209-4d56-8111-2bdcd4dc52e6-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn\" (UID: \"f69075a9-9209-4d56-8111-2bdcd4dc52e6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.236886 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f69075a9-9209-4d56-8111-2bdcd4dc52e6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn\" (UID: \"f69075a9-9209-4d56-8111-2bdcd4dc52e6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.255128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r99h\" (UniqueName: \"kubernetes.io/projected/74ad4854-4091-4485-b4da-881846999f3b-kube-api-access-2r99h\") pod \"obo-prometheus-operator-7c8cf85677-jmcm2\" (UID: \"74ad4854-4091-4485-b4da-881846999f3b\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.270315 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-v579p"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.271143 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.272805 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-5vwn9" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.273007 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.285259 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-v579p"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.338525 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/698fb753-09d8-462d-a57d-95b1cb6bae9a-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-v579p\" (UID: \"698fb753-09d8-462d-a57d-95b1cb6bae9a\") " pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.338786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2393602c-447f-4166-8ce8-7cb58c8d5510-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4\" (UID: \"2393602c-447f-4166-8ce8-7cb58c8d5510\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.338832 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2393602c-447f-4166-8ce8-7cb58c8d5510-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4\" (UID: \"2393602c-447f-4166-8ce8-7cb58c8d5510\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.338880 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ftlq\" (UniqueName: \"kubernetes.io/projected/698fb753-09d8-462d-a57d-95b1cb6bae9a-kube-api-access-6ftlq\") pod \"observability-operator-cc5f78dfc-v579p\" (UID: \"698fb753-09d8-462d-a57d-95b1cb6bae9a\") " pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.338910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f69075a9-9209-4d56-8111-2bdcd4dc52e6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn\" (UID: \"f69075a9-9209-4d56-8111-2bdcd4dc52e6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.338937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f69075a9-9209-4d56-8111-2bdcd4dc52e6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn\" (UID: \"f69075a9-9209-4d56-8111-2bdcd4dc52e6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.342780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2393602c-447f-4166-8ce8-7cb58c8d5510-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4\" (UID: \"2393602c-447f-4166-8ce8-7cb58c8d5510\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.342875 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2393602c-447f-4166-8ce8-7cb58c8d5510-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4\" (UID: \"2393602c-447f-4166-8ce8-7cb58c8d5510\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.343397 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f69075a9-9209-4d56-8111-2bdcd4dc52e6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn\" (UID: \"f69075a9-9209-4d56-8111-2bdcd4dc52e6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.343532 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f69075a9-9209-4d56-8111-2bdcd4dc52e6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn\" (UID: \"f69075a9-9209-4d56-8111-2bdcd4dc52e6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.357866 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.436632 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.443869 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/698fb753-09d8-462d-a57d-95b1cb6bae9a-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-v579p\" (UID: \"698fb753-09d8-462d-a57d-95b1cb6bae9a\") " pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.443961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ftlq\" (UniqueName: \"kubernetes.io/projected/698fb753-09d8-462d-a57d-95b1cb6bae9a-kube-api-access-6ftlq\") pod \"observability-operator-cc5f78dfc-v579p\" (UID: \"698fb753-09d8-462d-a57d-95b1cb6bae9a\") " pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.449511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/698fb753-09d8-462d-a57d-95b1cb6bae9a-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-v579p\" (UID: \"698fb753-09d8-462d-a57d-95b1cb6bae9a\") " pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.465523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.474797 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-995fk"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.475687 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.478345 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-sxqwf" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.482873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ftlq\" (UniqueName: \"kubernetes.io/projected/698fb753-09d8-462d-a57d-95b1cb6bae9a-kube-api-access-6ftlq\") pod \"observability-operator-cc5f78dfc-v579p\" (UID: \"698fb753-09d8-462d-a57d-95b1cb6bae9a\") " pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.484184 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-995fk"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.546561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/82f26847-56e4-48f0-b990-e2f4e8c9cfd6-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-995fk\" (UID: \"82f26847-56e4-48f0-b990-e2f4e8c9cfd6\") " pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.546663 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpt7x\" (UniqueName: \"kubernetes.io/projected/82f26847-56e4-48f0-b990-e2f4e8c9cfd6-kube-api-access-tpt7x\") pod \"perses-operator-54bc95c9fb-995fk\" (UID: \"82f26847-56e4-48f0-b990-e2f4e8c9cfd6\") " pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.584663 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.598857 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2"] Oct 01 13:17:59 crc kubenswrapper[4749]: W1001 13:17:59.612789 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74ad4854_4091_4485_b4da_881846999f3b.slice/crio-941055ca5371f71035f4d2e84191ef97894a8a6084bbe9eb990eebc7bc802694 WatchSource:0}: Error finding container 941055ca5371f71035f4d2e84191ef97894a8a6084bbe9eb990eebc7bc802694: Status 404 returned error can't find the container with id 941055ca5371f71035f4d2e84191ef97894a8a6084bbe9eb990eebc7bc802694 Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.649105 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpt7x\" (UniqueName: \"kubernetes.io/projected/82f26847-56e4-48f0-b990-e2f4e8c9cfd6-kube-api-access-tpt7x\") pod \"perses-operator-54bc95c9fb-995fk\" (UID: \"82f26847-56e4-48f0-b990-e2f4e8c9cfd6\") " pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.649163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/82f26847-56e4-48f0-b990-e2f4e8c9cfd6-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-995fk\" (UID: \"82f26847-56e4-48f0-b990-e2f4e8c9cfd6\") " pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.650098 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/82f26847-56e4-48f0-b990-e2f4e8c9cfd6-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-995fk\" (UID: 
\"82f26847-56e4-48f0-b990-e2f4e8c9cfd6\") " pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.666833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpt7x\" (UniqueName: \"kubernetes.io/projected/82f26847-56e4-48f0-b990-e2f4e8c9cfd6-kube-api-access-tpt7x\") pod \"perses-operator-54bc95c9fb-995fk\" (UID: \"82f26847-56e4-48f0-b990-e2f4e8c9cfd6\") " pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.683747 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.791048 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-v579p"] Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.798699 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:17:59 crc kubenswrapper[4749]: W1001 13:17:59.806621 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod698fb753_09d8_462d_a57d_95b1cb6bae9a.slice/crio-15ada00008c96962607621a697ebb412c4eb135d34f8c9e95593f511cbbdf972 WatchSource:0}: Error finding container 15ada00008c96962607621a697ebb412c4eb135d34f8c9e95593f511cbbdf972: Status 404 returned error can't find the container with id 15ada00008c96962607621a697ebb412c4eb135d34f8c9e95593f511cbbdf972 Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.836762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" event={"ID":"2393602c-447f-4166-8ce8-7cb58c8d5510","Type":"ContainerStarted","Data":"d8df71444f50733dc5c60a52dbf702f8910d0448da41200570e93079feb60857"} Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.837930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-v579p" event={"ID":"698fb753-09d8-462d-a57d-95b1cb6bae9a","Type":"ContainerStarted","Data":"15ada00008c96962607621a697ebb412c4eb135d34f8c9e95593f511cbbdf972"} Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.839038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2" event={"ID":"74ad4854-4091-4485-b4da-881846999f3b","Type":"ContainerStarted","Data":"941055ca5371f71035f4d2e84191ef97894a8a6084bbe9eb990eebc7bc802694"} Oct 01 13:17:59 crc kubenswrapper[4749]: I1001 13:17:59.941625 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn"] Oct 01 13:17:59 crc kubenswrapper[4749]: W1001 13:17:59.958747 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf69075a9_9209_4d56_8111_2bdcd4dc52e6.slice/crio-c212c1c612fb5033d9db0f8209925d607829560d1d6c37b8c3f8bb570ebd1e0c WatchSource:0}: Error finding container c212c1c612fb5033d9db0f8209925d607829560d1d6c37b8c3f8bb570ebd1e0c: Status 404 returned error can't find the container with id c212c1c612fb5033d9db0f8209925d607829560d1d6c37b8c3f8bb570ebd1e0c Oct 01 13:18:00 crc kubenswrapper[4749]: W1001 13:18:00.017852 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f26847_56e4_48f0_b990_e2f4e8c9cfd6.slice/crio-c72bd6bfc271558b67c30e13347935c878502fdd5370a8e9518f41d374e2cadd WatchSource:0}: Error finding container c72bd6bfc271558b67c30e13347935c878502fdd5370a8e9518f41d374e2cadd: Status 404 returned error can't find the container with id c72bd6bfc271558b67c30e13347935c878502fdd5370a8e9518f41d374e2cadd Oct 01 13:18:00 crc kubenswrapper[4749]: I1001 13:18:00.019633 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-995fk"] Oct 01 13:18:00 crc kubenswrapper[4749]: I1001 13:18:00.846287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" event={"ID":"f69075a9-9209-4d56-8111-2bdcd4dc52e6","Type":"ContainerStarted","Data":"c212c1c612fb5033d9db0f8209925d607829560d1d6c37b8c3f8bb570ebd1e0c"} Oct 01 13:18:00 crc kubenswrapper[4749]: I1001 13:18:00.847380 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-995fk" event={"ID":"82f26847-56e4-48f0-b990-e2f4e8c9cfd6","Type":"ContainerStarted","Data":"c72bd6bfc271558b67c30e13347935c878502fdd5370a8e9518f41d374e2cadd"} Oct 01 13:18:02 crc kubenswrapper[4749]: I1001 13:18:02.106758 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:18:02 crc kubenswrapper[4749]: I1001 13:18:02.106809 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:18:12 crc kubenswrapper[4749]: I1001 13:18:12.839902 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6smx5"] Oct 01 13:18:12 crc kubenswrapper[4749]: I1001 13:18:12.840522 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" podUID="e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" containerName="controller-manager" containerID="cri-o://03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e" gracePeriod=30 Oct 01 13:18:12 crc kubenswrapper[4749]: I1001 13:18:12.873045 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5"] Oct 01 13:18:12 crc kubenswrapper[4749]: I1001 13:18:12.873253 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" podUID="30a10511-6a81-4150-9ae3-976a8062accc" containerName="route-controller-manager" containerID="cri-o://a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29" gracePeriod=30 Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.381348 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.400794 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.487788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpwn2\" (UniqueName: \"kubernetes.io/projected/30a10511-6a81-4150-9ae3-976a8062accc-kube-api-access-hpwn2\") pod \"30a10511-6a81-4150-9ae3-976a8062accc\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.487856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-config\") pod \"30a10511-6a81-4150-9ae3-976a8062accc\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.487884 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-client-ca\") pod \"30a10511-6a81-4150-9ae3-976a8062accc\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.487942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a10511-6a81-4150-9ae3-976a8062accc-serving-cert\") pod \"30a10511-6a81-4150-9ae3-976a8062accc\" (UID: \"30a10511-6a81-4150-9ae3-976a8062accc\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.488636 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"30a10511-6a81-4150-9ae3-976a8062accc" (UID: "30a10511-6a81-4150-9ae3-976a8062accc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.488678 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-config" (OuterVolumeSpecName: "config") pod "30a10511-6a81-4150-9ae3-976a8062accc" (UID: "30a10511-6a81-4150-9ae3-976a8062accc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.493134 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a10511-6a81-4150-9ae3-976a8062accc-kube-api-access-hpwn2" (OuterVolumeSpecName: "kube-api-access-hpwn2") pod "30a10511-6a81-4150-9ae3-976a8062accc" (UID: "30a10511-6a81-4150-9ae3-976a8062accc"). InnerVolumeSpecName "kube-api-access-hpwn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.493156 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a10511-6a81-4150-9ae3-976a8062accc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "30a10511-6a81-4150-9ae3-976a8062accc" (UID: "30a10511-6a81-4150-9ae3-976a8062accc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.588511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7t5q\" (UniqueName: \"kubernetes.io/projected/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-kube-api-access-x7t5q\") pod \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.588822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-proxy-ca-bundles\") pod \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.588851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-serving-cert\") pod \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.588894 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-config\") pod \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.588925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-client-ca\") pod \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\" (UID: \"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565\") " Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.589151 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpwn2\" (UniqueName: 
\"kubernetes.io/projected/30a10511-6a81-4150-9ae3-976a8062accc-kube-api-access-hpwn2\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.589167 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.589177 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/30a10511-6a81-4150-9ae3-976a8062accc-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.589186 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a10511-6a81-4150-9ae3-976a8062accc-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.589636 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-client-ca" (OuterVolumeSpecName: "client-ca") pod "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" (UID: "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.589677 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" (UID: "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.589806 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-config" (OuterVolumeSpecName: "config") pod "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" (UID: "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.592567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" (UID: "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.594270 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-kube-api-access-x7t5q" (OuterVolumeSpecName: "kube-api-access-x7t5q") pod "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" (UID: "e21ba3a7-8bf6-4f0a-b614-8bfc29a30565"). InnerVolumeSpecName "kube-api-access-x7t5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.690757 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7t5q\" (UniqueName: \"kubernetes.io/projected/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-kube-api-access-x7t5q\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.690791 4749 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.690800 4749 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.690809 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.690817 4749 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.941005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-v579p" event={"ID":"698fb753-09d8-462d-a57d-95b1cb6bae9a","Type":"ContainerStarted","Data":"38dc73a39d59d9b5237420c61bac379ce70e25aa80d6eec23ff75298c45bfcd1"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.941366 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.942130 4749 generic.go:334] 
"Generic (PLEG): container finished" podID="e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" containerID="03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e" exitCode=0 Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.942182 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.942191 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" event={"ID":"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565","Type":"ContainerDied","Data":"03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.942230 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6smx5" event={"ID":"e21ba3a7-8bf6-4f0a-b614-8bfc29a30565","Type":"ContainerDied","Data":"cf00cee0fd01789c717b276ef4831cdc2a24d61d9d0dae82775f7cb481f0a48e"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.942247 4749 scope.go:117] "RemoveContainer" containerID="03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.944054 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-v579p" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.944077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" event={"ID":"f69075a9-9209-4d56-8111-2bdcd4dc52e6","Type":"ContainerStarted","Data":"4110693976d61273a6c63a6002950f6811a11813d080f143af890070b3d20fb6"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.958488 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2" 
event={"ID":"74ad4854-4091-4485-b4da-881846999f3b","Type":"ContainerStarted","Data":"f67fed04373d667a81fcf17003f3c3477e1fbca0e97dfb9e7eb92154446cd1f8"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.967086 4749 generic.go:334] "Generic (PLEG): container finished" podID="30a10511-6a81-4150-9ae3-976a8062accc" containerID="a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29" exitCode=0 Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.967240 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.967939 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" event={"ID":"30a10511-6a81-4150-9ae3-976a8062accc","Type":"ContainerDied","Data":"a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.968004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5" event={"ID":"30a10511-6a81-4150-9ae3-976a8062accc","Type":"ContainerDied","Data":"fc802df61ae6788dc94c50aaf861131e56a6bb6b0668429b62e7616db74be6b6"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.975931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-995fk" event={"ID":"82f26847-56e4-48f0-b990-e2f4e8c9cfd6","Type":"ContainerStarted","Data":"6da23d587dfeee60612cfad1842ffb3fbc32b5b21956d9fb481924b2f6ea2722"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.976511 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-995fk" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.983675 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" event={"ID":"2393602c-447f-4166-8ce8-7cb58c8d5510","Type":"ContainerStarted","Data":"48239bef30fc3158fd73bfc297314b85e808f4c610bbbe8c007af13551fa9181"} Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.991334 4749 scope.go:117] "RemoveContainer" containerID="03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e" Oct 01 13:18:13 crc kubenswrapper[4749]: E1001 13:18:13.995293 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e\": container with ID starting with 03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e not found: ID does not exist" containerID="03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.995323 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e"} err="failed to get container status \"03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e\": rpc error: code = NotFound desc = could not find container \"03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e\": container with ID starting with 03d5658b71fd49d65d5ba637569f27e929efb25198f150f3a91eb579fd5de13e not found: ID does not exist" Oct 01 13:18:13 crc kubenswrapper[4749]: I1001 13:18:13.995343 4749 scope.go:117] "RemoveContainer" containerID="a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.027014 4749 scope.go:117] "RemoveContainer" containerID="a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29" Oct 01 13:18:14 crc kubenswrapper[4749]: E1001 13:18:14.031912 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29\": container with ID starting with a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29 not found: ID does not exist" containerID="a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.031946 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29"} err="failed to get container status \"a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29\": rpc error: code = NotFound desc = could not find container \"a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29\": container with ID starting with a35d5ece2f40860aeb64aa055e3d3c72a60d28af79d37250aa58d179625beb29 not found: ID does not exist" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.050392 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn" podStartSLOduration=2.055822464 podStartE2EDuration="15.050366061s" podCreationTimestamp="2025-10-01 13:17:59 +0000 UTC" firstStartedPulling="2025-10-01 13:17:59.962027201 +0000 UTC m=+740.016012100" lastFinishedPulling="2025-10-01 13:18:12.956570798 +0000 UTC m=+753.010555697" observedRunningTime="2025-10-01 13:18:14.046605738 +0000 UTC m=+754.100590637" watchObservedRunningTime="2025-10-01 13:18:14.050366061 +0000 UTC m=+754.104350960" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.050718 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-v579p" podStartSLOduration=1.8420526430000002 podStartE2EDuration="15.050712581s" podCreationTimestamp="2025-10-01 13:17:59 +0000 UTC" firstStartedPulling="2025-10-01 13:17:59.816357357 +0000 UTC m=+739.870342256" lastFinishedPulling="2025-10-01 
13:18:13.025017295 +0000 UTC m=+753.079002194" observedRunningTime="2025-10-01 13:18:14.019449354 +0000 UTC m=+754.073434253" watchObservedRunningTime="2025-10-01 13:18:14.050712581 +0000 UTC m=+754.104697480" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.063163 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6smx5"] Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.065764 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6smx5"] Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.082677 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4" podStartSLOduration=1.826195427 podStartE2EDuration="15.082666837s" podCreationTimestamp="2025-10-01 13:17:59 +0000 UTC" firstStartedPulling="2025-10-01 13:17:59.695902413 +0000 UTC m=+739.749887312" lastFinishedPulling="2025-10-01 13:18:12.952373823 +0000 UTC m=+753.006358722" observedRunningTime="2025-10-01 13:18:14.08057112 +0000 UTC m=+754.134556019" watchObservedRunningTime="2025-10-01 13:18:14.082666837 +0000 UTC m=+754.136651736" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.106969 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-995fk" podStartSLOduration=2.120161629 podStartE2EDuration="15.106953163s" podCreationTimestamp="2025-10-01 13:17:59 +0000 UTC" firstStartedPulling="2025-10-01 13:18:00.020457173 +0000 UTC m=+740.074442072" lastFinishedPulling="2025-10-01 13:18:13.007248707 +0000 UTC m=+753.061233606" observedRunningTime="2025-10-01 13:18:14.103987332 +0000 UTC m=+754.157972231" watchObservedRunningTime="2025-10-01 13:18:14.106953163 +0000 UTC m=+754.160938052" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.123262 4749 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-jmcm2" podStartSLOduration=1.732760666 podStartE2EDuration="15.1232467s" podCreationTimestamp="2025-10-01 13:17:59 +0000 UTC" firstStartedPulling="2025-10-01 13:17:59.615968092 +0000 UTC m=+739.669952991" lastFinishedPulling="2025-10-01 13:18:13.006454126 +0000 UTC m=+753.060439025" observedRunningTime="2025-10-01 13:18:14.120775672 +0000 UTC m=+754.174760571" watchObservedRunningTime="2025-10-01 13:18:14.1232467 +0000 UTC m=+754.177231599" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.150074 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5"] Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.152875 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4v2g5"] Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.851943 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6"] Oct 01 13:18:14 crc kubenswrapper[4749]: E1001 13:18:14.852190 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" containerName="controller-manager" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.852202 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" containerName="controller-manager" Oct 01 13:18:14 crc kubenswrapper[4749]: E1001 13:18:14.852232 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a10511-6a81-4150-9ae3-976a8062accc" containerName="route-controller-manager" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.852240 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a10511-6a81-4150-9ae3-976a8062accc" containerName="route-controller-manager" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 
13:18:14.852331 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a10511-6a81-4150-9ae3-976a8062accc" containerName="route-controller-manager" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.852349 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" containerName="controller-manager" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.852776 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.854894 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f6cbf8c45-c548f"] Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.855640 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.859311 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.859832 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.859864 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.859901 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.860485 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 13:18:14 crc kubenswrapper[4749]: 
I1001 13:18:14.860537 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.860638 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.866142 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.866396 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.866585 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.866704 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.870120 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6"] Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.873096 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.876711 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 13:18:14 crc kubenswrapper[4749]: I1001 13:18:14.926569 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f6cbf8c45-c548f"] Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.004581 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-config\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.004664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-proxy-ca-bundles\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.004701 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d24b825f-683f-49e4-aa14-54edc1e8214f-serving-cert\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.004730 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhfxf\" (UniqueName: \"kubernetes.io/projected/d24b825f-683f-49e4-aa14-54edc1e8214f-kube-api-access-nhfxf\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.004786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02944e9a-d708-48ca-90bc-ae606336c742-client-ca\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " 
pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.004820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02944e9a-d708-48ca-90bc-ae606336c742-config\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.004892 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-client-ca\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.004958 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l9dr\" (UniqueName: \"kubernetes.io/projected/02944e9a-d708-48ca-90bc-ae606336c742-kube-api-access-8l9dr\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.005016 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02944e9a-d708-48ca-90bc-ae606336c742-serving-cert\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.105974 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d24b825f-683f-49e4-aa14-54edc1e8214f-serving-cert\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.106042 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhfxf\" (UniqueName: \"kubernetes.io/projected/d24b825f-683f-49e4-aa14-54edc1e8214f-kube-api-access-nhfxf\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.106429 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02944e9a-d708-48ca-90bc-ae606336c742-client-ca\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.106491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02944e9a-d708-48ca-90bc-ae606336c742-config\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.106510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-client-ca\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " 
pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.107210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/02944e9a-d708-48ca-90bc-ae606336c742-client-ca\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.107502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-client-ca\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.107531 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02944e9a-d708-48ca-90bc-ae606336c742-config\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.106746 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l9dr\" (UniqueName: \"kubernetes.io/projected/02944e9a-d708-48ca-90bc-ae606336c742-kube-api-access-8l9dr\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.107646 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/02944e9a-d708-48ca-90bc-ae606336c742-serving-cert\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.107852 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-config\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.108152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-proxy-ca-bundles\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.108915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-proxy-ca-bundles\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.109332 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d24b825f-683f-49e4-aa14-54edc1e8214f-serving-cert\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.109504 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d24b825f-683f-49e4-aa14-54edc1e8214f-config\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.112717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02944e9a-d708-48ca-90bc-ae606336c742-serving-cert\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.132824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhfxf\" (UniqueName: \"kubernetes.io/projected/d24b825f-683f-49e4-aa14-54edc1e8214f-kube-api-access-nhfxf\") pod \"controller-manager-6f6cbf8c45-c548f\" (UID: \"d24b825f-683f-49e4-aa14-54edc1e8214f\") " pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.140874 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l9dr\" (UniqueName: \"kubernetes.io/projected/02944e9a-d708-48ca-90bc-ae606336c742-kube-api-access-8l9dr\") pod \"route-controller-manager-7b696d8865-gkjd6\" (UID: \"02944e9a-d708-48ca-90bc-ae606336c742\") " pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.169500 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.177039 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.238011 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a10511-6a81-4150-9ae3-976a8062accc" path="/var/lib/kubelet/pods/30a10511-6a81-4150-9ae3-976a8062accc/volumes"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.238956 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21ba3a7-8bf6-4f0a-b614-8bfc29a30565" path="/var/lib/kubelet/pods/e21ba3a7-8bf6-4f0a-b614-8bfc29a30565/volumes"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.418728 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f6cbf8c45-c548f"]
Oct 01 13:18:15 crc kubenswrapper[4749]: W1001 13:18:15.425279 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24b825f_683f_49e4_aa14_54edc1e8214f.slice/crio-099260244a25d86ff4ec6c8f51876d8d0ef0eb6ea0f66ba2ec34901f10a4c64c WatchSource:0}: Error finding container 099260244a25d86ff4ec6c8f51876d8d0ef0eb6ea0f66ba2ec34901f10a4c64c: Status 404 returned error can't find the container with id 099260244a25d86ff4ec6c8f51876d8d0ef0eb6ea0f66ba2ec34901f10a4c64c
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.576702 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6"]
Oct 01 13:18:15 crc kubenswrapper[4749]: W1001 13:18:15.586430 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02944e9a_d708_48ca_90bc_ae606336c742.slice/crio-47f0650893c94dff22be3a69b7c38ac440843cdbb98e09106be762363d51cbb7 WatchSource:0}: Error finding container 47f0650893c94dff22be3a69b7c38ac440843cdbb98e09106be762363d51cbb7: Status 404 returned error can't find the container with id 47f0650893c94dff22be3a69b7c38ac440843cdbb98e09106be762363d51cbb7
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.995688 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" event={"ID":"02944e9a-d708-48ca-90bc-ae606336c742","Type":"ContainerStarted","Data":"9456c31a63f38a096230c3c604e1e6a392c924cc4901d58bccb867b4988f8009"}
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.995729 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" event={"ID":"02944e9a-d708-48ca-90bc-ae606336c742","Type":"ContainerStarted","Data":"47f0650893c94dff22be3a69b7c38ac440843cdbb98e09106be762363d51cbb7"}
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.996015 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.998480 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" event={"ID":"d24b825f-683f-49e4-aa14-54edc1e8214f","Type":"ContainerStarted","Data":"8a8ad226f85b41321533683289eaa1da56220089895876afa9ff1b1936adc1ad"}
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.998504 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f"
Oct 01 13:18:15 crc kubenswrapper[4749]: I1001 13:18:15.998514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" event={"ID":"d24b825f-683f-49e4-aa14-54edc1e8214f","Type":"ContainerStarted","Data":"099260244a25d86ff4ec6c8f51876d8d0ef0eb6ea0f66ba2ec34901f10a4c64c"}
Oct 01 13:18:16 crc kubenswrapper[4749]: I1001 13:18:16.003553 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f"
Oct 01 13:18:16 crc kubenswrapper[4749]: I1001 13:18:16.024091 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6" podStartSLOduration=3.024075363 podStartE2EDuration="3.024075363s" podCreationTimestamp="2025-10-01 13:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:18:16.010053828 +0000 UTC m=+756.064038727" watchObservedRunningTime="2025-10-01 13:18:16.024075363 +0000 UTC m=+756.078060262"
Oct 01 13:18:16 crc kubenswrapper[4749]: I1001 13:18:16.025579 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f6cbf8c45-c548f" podStartSLOduration=3.025574234 podStartE2EDuration="3.025574234s" podCreationTimestamp="2025-10-01 13:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:18:16.024488914 +0000 UTC m=+756.078473813" watchObservedRunningTime="2025-10-01 13:18:16.025574234 +0000 UTC m=+756.079559133"
Oct 01 13:18:16 crc kubenswrapper[4749]: I1001 13:18:16.303431 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b696d8865-gkjd6"
Oct 01 13:18:19 crc kubenswrapper[4749]: I1001 13:18:19.673926 4749 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 01 13:18:19 crc kubenswrapper[4749]: I1001 13:18:19.802388 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-995fk"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.621080 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qsz45"]
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.626524 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.695244 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsz45"]
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.783735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-catalog-content\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.783798 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bh7d\" (UniqueName: \"kubernetes.io/projected/14505f73-fb06-4168-b5f6-b2ce51d65809-kube-api-access-9bh7d\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.783896 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-utilities\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.884665 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-catalog-content\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.884730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bh7d\" (UniqueName: \"kubernetes.io/projected/14505f73-fb06-4168-b5f6-b2ce51d65809-kube-api-access-9bh7d\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.884798 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-utilities\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.885322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-catalog-content\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.885391 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-utilities\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.905565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bh7d\" (UniqueName: \"kubernetes.io/projected/14505f73-fb06-4168-b5f6-b2ce51d65809-kube-api-access-9bh7d\") pod \"redhat-marketplace-qsz45\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") " pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:28 crc kubenswrapper[4749]: I1001 13:18:28.950295 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:29 crc kubenswrapper[4749]: I1001 13:18:29.425107 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsz45"]
Oct 01 13:18:30 crc kubenswrapper[4749]: I1001 13:18:30.085787 4749 generic.go:334] "Generic (PLEG): container finished" podID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerID="d6e89865055a95086884e1a36aa67aa66890bca86464ced2ea57736e65fc5993" exitCode=0
Oct 01 13:18:30 crc kubenswrapper[4749]: I1001 13:18:30.086004 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsz45" event={"ID":"14505f73-fb06-4168-b5f6-b2ce51d65809","Type":"ContainerDied","Data":"d6e89865055a95086884e1a36aa67aa66890bca86464ced2ea57736e65fc5993"}
Oct 01 13:18:30 crc kubenswrapper[4749]: I1001 13:18:30.086173 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsz45" event={"ID":"14505f73-fb06-4168-b5f6-b2ce51d65809","Type":"ContainerStarted","Data":"f505476121b12e9d1344e8df4332394983bb68fd8763552654665ee3a8552657"}
Oct 01 13:18:31 crc kubenswrapper[4749]: E1001 13:18:31.310484 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14505f73_fb06_4168_b5f6_b2ce51d65809.slice/crio-271fdb3b42521443f9f5bc11cd2fbb0e4714c26d7c7bf920216badbde183b4e2.scope\": RecentStats: unable to find data in memory cache]"
Oct 01 13:18:32 crc kubenswrapper[4749]: I1001 13:18:32.103961 4749 generic.go:334] "Generic (PLEG): container finished" podID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerID="271fdb3b42521443f9f5bc11cd2fbb0e4714c26d7c7bf920216badbde183b4e2" exitCode=0
Oct 01 13:18:32 crc kubenswrapper[4749]: I1001 13:18:32.104033 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsz45" event={"ID":"14505f73-fb06-4168-b5f6-b2ce51d65809","Type":"ContainerDied","Data":"271fdb3b42521443f9f5bc11cd2fbb0e4714c26d7c7bf920216badbde183b4e2"}
Oct 01 13:18:32 crc kubenswrapper[4749]: I1001 13:18:32.106742 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:18:32 crc kubenswrapper[4749]: I1001 13:18:32.106861 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:18:32 crc kubenswrapper[4749]: I1001 13:18:32.107044 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz"
Oct 01 13:18:32 crc kubenswrapper[4749]: I1001 13:18:32.107548 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf9f42cda1c41e2e8bbd8a36b3bc094b1e73bf93791da0833fab72799f05a62e"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 13:18:32 crc kubenswrapper[4749]: I1001 13:18:32.107681 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://bf9f42cda1c41e2e8bbd8a36b3bc094b1e73bf93791da0833fab72799f05a62e" gracePeriod=600
Oct 01 13:18:33 crc kubenswrapper[4749]: I1001 13:18:33.113238 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"bf9f42cda1c41e2e8bbd8a36b3bc094b1e73bf93791da0833fab72799f05a62e"}
Oct 01 13:18:33 crc kubenswrapper[4749]: I1001 13:18:33.113621 4749 scope.go:117] "RemoveContainer" containerID="d37b6e61f0b3af0ecef6f3f6b8e1af5456836ce443313f97dfd8a281c3e0b927"
Oct 01 13:18:33 crc kubenswrapper[4749]: I1001 13:18:33.113240 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="bf9f42cda1c41e2e8bbd8a36b3bc094b1e73bf93791da0833fab72799f05a62e" exitCode=0
Oct 01 13:18:33 crc kubenswrapper[4749]: I1001 13:18:33.113773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"8d0168c71ffab7f25d1f8d0338f65483c24fac9eb00fb19dbac8b302415e4b3e"}
Oct 01 13:18:33 crc kubenswrapper[4749]: I1001 13:18:33.119816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsz45" event={"ID":"14505f73-fb06-4168-b5f6-b2ce51d65809","Type":"ContainerStarted","Data":"dc2954dadd92037029d74bf40bd8a12a04c66b9144ddf9919f1eac56f491ba15"}
Oct 01 13:18:33 crc kubenswrapper[4749]: I1001 13:18:33.162945 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qsz45" podStartSLOduration=2.673179689 podStartE2EDuration="5.162926961s" podCreationTimestamp="2025-10-01 13:18:28 +0000 UTC" firstStartedPulling="2025-10-01 13:18:30.088737756 +0000 UTC m=+770.142722655" lastFinishedPulling="2025-10-01 13:18:32.578485018 +0000 UTC m=+772.632469927" observedRunningTime="2025-10-01 13:18:33.161071657 +0000 UTC m=+773.215056556" watchObservedRunningTime="2025-10-01 13:18:33.162926961 +0000 UTC m=+773.216911860"
Oct 01 13:18:38 crc kubenswrapper[4749]: I1001 13:18:38.950626 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:38 crc kubenswrapper[4749]: I1001 13:18:38.951303 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.001756 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.164932 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"]
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.172806 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.180352 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.195701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"]
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.242599 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.335548 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zjnh\" (UniqueName: \"kubernetes.io/projected/8544261d-2187-42db-a0ce-11ff55d6bff7-kube-api-access-8zjnh\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.335733 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.335789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.436853 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.436902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.436986 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zjnh\" (UniqueName: \"kubernetes.io/projected/8544261d-2187-42db-a0ce-11ff55d6bff7-kube-api-access-8zjnh\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.437339 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.437591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.456109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zjnh\" (UniqueName: \"kubernetes.io/projected/8544261d-2187-42db-a0ce-11ff55d6bff7-kube-api-access-8zjnh\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.509935 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:39 crc kubenswrapper[4749]: I1001 13:18:39.965793 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"]
Oct 01 13:18:40 crc kubenswrapper[4749]: I1001 13:18:40.186306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9" event={"ID":"8544261d-2187-42db-a0ce-11ff55d6bff7","Type":"ContainerStarted","Data":"2587694a3df20cad0660975c764f2092f1a02ad52c081f403805c7b987dbc110"}
Oct 01 13:18:40 crc kubenswrapper[4749]: I1001 13:18:40.186721 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9" event={"ID":"8544261d-2187-42db-a0ce-11ff55d6bff7","Type":"ContainerStarted","Data":"5208ebd9c5f157fca1a6941a9b77a0666a7d4396e40233392292831f99182566"}
Oct 01 13:18:41 crc kubenswrapper[4749]: I1001 13:18:41.194073 4749 generic.go:334] "Generic (PLEG): container finished" podID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerID="2587694a3df20cad0660975c764f2092f1a02ad52c081f403805c7b987dbc110" exitCode=0
Oct 01 13:18:41 crc kubenswrapper[4749]: I1001 13:18:41.194132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9" event={"ID":"8544261d-2187-42db-a0ce-11ff55d6bff7","Type":"ContainerDied","Data":"2587694a3df20cad0660975c764f2092f1a02ad52c081f403805c7b987dbc110"}
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.501842 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsz45"]
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.502273 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qsz45" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerName="registry-server" containerID="cri-o://dc2954dadd92037029d74bf40bd8a12a04c66b9144ddf9919f1eac56f491ba15" gracePeriod=2
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.703874 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6xfbz"]
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.704918 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.728003 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6xfbz"]
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.884981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-catalog-content\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.885067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-utilities\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.885133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nt6h\" (UniqueName: \"kubernetes.io/projected/ce431d3d-dfe9-4057-8416-d8f9af44237b-kube-api-access-7nt6h\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.987279 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-catalog-content\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.987367 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-utilities\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.987422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nt6h\" (UniqueName: \"kubernetes.io/projected/ce431d3d-dfe9-4057-8416-d8f9af44237b-kube-api-access-7nt6h\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.988012 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-utilities\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:42 crc kubenswrapper[4749]: I1001 13:18:42.988372 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-catalog-content\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:43 crc kubenswrapper[4749]: I1001 13:18:43.013921 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nt6h\" (UniqueName: \"kubernetes.io/projected/ce431d3d-dfe9-4057-8416-d8f9af44237b-kube-api-access-7nt6h\") pod \"redhat-operators-6xfbz\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:43 crc kubenswrapper[4749]: I1001 13:18:43.026500 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6xfbz"
Oct 01 13:18:43 crc kubenswrapper[4749]: I1001 13:18:43.213658 4749 generic.go:334] "Generic (PLEG): container finished" podID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerID="dc2954dadd92037029d74bf40bd8a12a04c66b9144ddf9919f1eac56f491ba15" exitCode=0
Oct 01 13:18:43 crc kubenswrapper[4749]: I1001 13:18:43.213763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsz45" event={"ID":"14505f73-fb06-4168-b5f6-b2ce51d65809","Type":"ContainerDied","Data":"dc2954dadd92037029d74bf40bd8a12a04c66b9144ddf9919f1eac56f491ba15"}
Oct 01 13:18:43 crc kubenswrapper[4749]: I1001 13:18:43.462818 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6xfbz"]
Oct 01 13:18:43 crc kubenswrapper[4749]: W1001 13:18:43.467280 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce431d3d_dfe9_4057_8416_d8f9af44237b.slice/crio-66eb5f84e3d21c1227fd39ffe2eac5ccfe216de75f1cfaa9c4e99e131a91897a WatchSource:0}: Error finding container 66eb5f84e3d21c1227fd39ffe2eac5ccfe216de75f1cfaa9c4e99e131a91897a: Status 404 returned error can't find the container with id 66eb5f84e3d21c1227fd39ffe2eac5ccfe216de75f1cfaa9c4e99e131a91897a
Oct 01 13:18:44 crc kubenswrapper[4749]: I1001 13:18:44.219853 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xfbz" event={"ID":"ce431d3d-dfe9-4057-8416-d8f9af44237b","Type":"ContainerStarted","Data":"66eb5f84e3d21c1227fd39ffe2eac5ccfe216de75f1cfaa9c4e99e131a91897a"}
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.166693 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.227412 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qsz45"
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.227444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qsz45" event={"ID":"14505f73-fb06-4168-b5f6-b2ce51d65809","Type":"ContainerDied","Data":"f505476121b12e9d1344e8df4332394983bb68fd8763552654665ee3a8552657"}
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.227503 4749 scope.go:117] "RemoveContainer" containerID="dc2954dadd92037029d74bf40bd8a12a04c66b9144ddf9919f1eac56f491ba15"
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.228932 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerID="3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13" exitCode=0
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.233210 4749 generic.go:334] "Generic (PLEG): container finished" podID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerID="19d19c00812e97bc9f8c93fa97a7cf8f7e0f896817d2d200fc048e2ed4302501" exitCode=0
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.240158 4749 scope.go:117] "RemoveContainer" containerID="271fdb3b42521443f9f5bc11cd2fbb0e4714c26d7c7bf920216badbde183b4e2"
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.251610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xfbz" event={"ID":"ce431d3d-dfe9-4057-8416-d8f9af44237b","Type":"ContainerDied","Data":"3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13"}
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.251656 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9" event={"ID":"8544261d-2187-42db-a0ce-11ff55d6bff7","Type":"ContainerDied","Data":"19d19c00812e97bc9f8c93fa97a7cf8f7e0f896817d2d200fc048e2ed4302501"}
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.279617 4749 scope.go:117] "RemoveContainer" containerID="d6e89865055a95086884e1a36aa67aa66890bca86464ced2ea57736e65fc5993"
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.323119 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-catalog-content\") pod \"14505f73-fb06-4168-b5f6-b2ce51d65809\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") "
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.323201 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-utilities\") pod \"14505f73-fb06-4168-b5f6-b2ce51d65809\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") "
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.323277 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bh7d\" (UniqueName: \"kubernetes.io/projected/14505f73-fb06-4168-b5f6-b2ce51d65809-kube-api-access-9bh7d\") pod \"14505f73-fb06-4168-b5f6-b2ce51d65809\" (UID: \"14505f73-fb06-4168-b5f6-b2ce51d65809\") "
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.323931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-utilities" (OuterVolumeSpecName: "utilities") pod "14505f73-fb06-4168-b5f6-b2ce51d65809" (UID: "14505f73-fb06-4168-b5f6-b2ce51d65809"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.341158 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14505f73-fb06-4168-b5f6-b2ce51d65809-kube-api-access-9bh7d" (OuterVolumeSpecName: "kube-api-access-9bh7d") pod "14505f73-fb06-4168-b5f6-b2ce51d65809" (UID: "14505f73-fb06-4168-b5f6-b2ce51d65809"). InnerVolumeSpecName "kube-api-access-9bh7d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.356376 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14505f73-fb06-4168-b5f6-b2ce51d65809" (UID: "14505f73-fb06-4168-b5f6-b2ce51d65809"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.424722 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bh7d\" (UniqueName: \"kubernetes.io/projected/14505f73-fb06-4168-b5f6-b2ce51d65809-kube-api-access-9bh7d\") on node \"crc\" DevicePath \"\""
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.424961 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.425053 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14505f73-fb06-4168-b5f6-b2ce51d65809-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.565611 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsz45"]
Oct 01 13:18:45 crc kubenswrapper[4749]: I1001 13:18:45.571885 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qsz45"]
Oct 01 13:18:46 crc kubenswrapper[4749]: I1001 13:18:46.248272 4749 generic.go:334] "Generic (PLEG): container finished" podID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerID="2f2e5a4e0e702364c0b7a5085ac977510317d0210f1d19460d618d9bf618a865" exitCode=0
Oct 01 13:18:46 crc kubenswrapper[4749]: I1001 13:18:46.248372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9" event={"ID":"8544261d-2187-42db-a0ce-11ff55d6bff7","Type":"ContainerDied","Data":"2f2e5a4e0e702364c0b7a5085ac977510317d0210f1d19460d618d9bf618a865"}
Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.238859 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" path="/var/lib/kubelet/pods/14505f73-fb06-4168-b5f6-b2ce51d65809/volumes"
Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.259140 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerID="ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0" exitCode=0
Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.259305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xfbz" event={"ID":"ce431d3d-dfe9-4057-8416-d8f9af44237b","Type":"ContainerDied","Data":"ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0"}
Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.650565 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9"
Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.772913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-util\") pod \"8544261d-2187-42db-a0ce-11ff55d6bff7\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") "
Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.773041 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zjnh\" (UniqueName: \"kubernetes.io/projected/8544261d-2187-42db-a0ce-11ff55d6bff7-kube-api-access-8zjnh\") pod \"8544261d-2187-42db-a0ce-11ff55d6bff7\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") "
Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.773162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-bundle\") pod \"8544261d-2187-42db-a0ce-11ff55d6bff7\" (UID: \"8544261d-2187-42db-a0ce-11ff55d6bff7\") "
Oct 01 13:18:47 crc
kubenswrapper[4749]: I1001 13:18:47.773977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-bundle" (OuterVolumeSpecName: "bundle") pod "8544261d-2187-42db-a0ce-11ff55d6bff7" (UID: "8544261d-2187-42db-a0ce-11ff55d6bff7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.778478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8544261d-2187-42db-a0ce-11ff55d6bff7-kube-api-access-8zjnh" (OuterVolumeSpecName: "kube-api-access-8zjnh") pod "8544261d-2187-42db-a0ce-11ff55d6bff7" (UID: "8544261d-2187-42db-a0ce-11ff55d6bff7"). InnerVolumeSpecName "kube-api-access-8zjnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.794700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-util" (OuterVolumeSpecName: "util") pod "8544261d-2187-42db-a0ce-11ff55d6bff7" (UID: "8544261d-2187-42db-a0ce-11ff55d6bff7"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.874815 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.874875 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zjnh\" (UniqueName: \"kubernetes.io/projected/8544261d-2187-42db-a0ce-11ff55d6bff7-kube-api-access-8zjnh\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:47 crc kubenswrapper[4749]: I1001 13:18:47.874897 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8544261d-2187-42db-a0ce-11ff55d6bff7-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:18:48 crc kubenswrapper[4749]: I1001 13:18:48.267415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xfbz" event={"ID":"ce431d3d-dfe9-4057-8416-d8f9af44237b","Type":"ContainerStarted","Data":"6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552"} Oct 01 13:18:48 crc kubenswrapper[4749]: I1001 13:18:48.269355 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9" event={"ID":"8544261d-2187-42db-a0ce-11ff55d6bff7","Type":"ContainerDied","Data":"5208ebd9c5f157fca1a6941a9b77a0666a7d4396e40233392292831f99182566"} Oct 01 13:18:48 crc kubenswrapper[4749]: I1001 13:18:48.269400 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5208ebd9c5f157fca1a6941a9b77a0666a7d4396e40233392292831f99182566" Oct 01 13:18:48 crc kubenswrapper[4749]: I1001 13:18:48.269425 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9" Oct 01 13:18:48 crc kubenswrapper[4749]: I1001 13:18:48.292716 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6xfbz" podStartSLOduration=3.540309399 podStartE2EDuration="6.292691889s" podCreationTimestamp="2025-10-01 13:18:42 +0000 UTC" firstStartedPulling="2025-10-01 13:18:45.230136652 +0000 UTC m=+785.284121561" lastFinishedPulling="2025-10-01 13:18:47.982519112 +0000 UTC m=+788.036504051" observedRunningTime="2025-10-01 13:18:48.29169209 +0000 UTC m=+788.345676999" watchObservedRunningTime="2025-10-01 13:18:48.292691889 +0000 UTC m=+788.346676798" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.202657 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb"] Oct 01 13:18:50 crc kubenswrapper[4749]: E1001 13:18:50.203108 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerName="util" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203119 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerName="util" Oct 01 13:18:50 crc kubenswrapper[4749]: E1001 13:18:50.203128 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerName="extract" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203134 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerName="extract" Oct 01 13:18:50 crc kubenswrapper[4749]: E1001 13:18:50.203141 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerName="pull" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203147 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerName="pull" Oct 01 13:18:50 crc kubenswrapper[4749]: E1001 13:18:50.203157 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerName="extract-content" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203163 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerName="extract-content" Oct 01 13:18:50 crc kubenswrapper[4749]: E1001 13:18:50.203174 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerName="registry-server" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203180 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerName="registry-server" Oct 01 13:18:50 crc kubenswrapper[4749]: E1001 13:18:50.203189 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerName="extract-utilities" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203195 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerName="extract-utilities" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203317 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="14505f73-fb06-4168-b5f6-b2ce51d65809" containerName="registry-server" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203331 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8544261d-2187-42db-a0ce-11ff55d6bff7" containerName="extract" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.203695 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.205785 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.206086 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gz6ps" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.207952 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.213193 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdsgf\" (UniqueName: \"kubernetes.io/projected/a4382132-aa77-4918-8533-ea2d0cf18eba-kube-api-access-sdsgf\") pod \"nmstate-operator-5d6f6cfd66-c7stb\" (UID: \"a4382132-aa77-4918-8533-ea2d0cf18eba\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.216468 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb"] Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.314650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdsgf\" (UniqueName: \"kubernetes.io/projected/a4382132-aa77-4918-8533-ea2d0cf18eba-kube-api-access-sdsgf\") pod \"nmstate-operator-5d6f6cfd66-c7stb\" (UID: \"a4382132-aa77-4918-8533-ea2d0cf18eba\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.331930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdsgf\" (UniqueName: \"kubernetes.io/projected/a4382132-aa77-4918-8533-ea2d0cf18eba-kube-api-access-sdsgf\") pod \"nmstate-operator-5d6f6cfd66-c7stb\" (UID: 
\"a4382132-aa77-4918-8533-ea2d0cf18eba\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb" Oct 01 13:18:50 crc kubenswrapper[4749]: I1001 13:18:50.519413 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb" Oct 01 13:18:51 crc kubenswrapper[4749]: I1001 13:18:51.051364 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb"] Oct 01 13:18:51 crc kubenswrapper[4749]: I1001 13:18:51.290807 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb" event={"ID":"a4382132-aa77-4918-8533-ea2d0cf18eba","Type":"ContainerStarted","Data":"7a09987d6930224c8895e67a71b8d7646f14620a0dccc137cd9be230b0067fc1"} Oct 01 13:18:53 crc kubenswrapper[4749]: I1001 13:18:53.027206 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6xfbz" Oct 01 13:18:53 crc kubenswrapper[4749]: I1001 13:18:53.027604 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6xfbz" Oct 01 13:18:54 crc kubenswrapper[4749]: I1001 13:18:54.093885 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6xfbz" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="registry-server" probeResult="failure" output=< Oct 01 13:18:54 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Oct 01 13:18:54 crc kubenswrapper[4749]: > Oct 01 13:18:55 crc kubenswrapper[4749]: I1001 13:18:55.321146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb" event={"ID":"a4382132-aa77-4918-8533-ea2d0cf18eba","Type":"ContainerStarted","Data":"2a6685784e96d458a7e1481594f68dd2c35c2862db709dd06319ba5bd1ac0132"} Oct 01 13:18:55 crc kubenswrapper[4749]: I1001 13:18:55.347763 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-c7stb" podStartSLOduration=1.658544346 podStartE2EDuration="5.34774082s" podCreationTimestamp="2025-10-01 13:18:50 +0000 UTC" firstStartedPulling="2025-10-01 13:18:51.061567108 +0000 UTC m=+791.115552017" lastFinishedPulling="2025-10-01 13:18:54.750763562 +0000 UTC m=+794.804748491" observedRunningTime="2025-10-01 13:18:55.342818697 +0000 UTC m=+795.396803606" watchObservedRunningTime="2025-10-01 13:18:55.34774082 +0000 UTC m=+795.401725749" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.470283 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.472090 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.479264 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-p48xc" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.479382 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.480629 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.482414 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.495139 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.504005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.532124 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-vd27x"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.541122 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.562103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-nmstate-lock\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.562174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2237dcc7-ae68-4298-87d0-44d81d96b3c5-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-tk6l4\" (UID: \"2237dcc7-ae68-4298-87d0-44d81d96b3c5\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.562249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxq4g\" (UniqueName: 
\"kubernetes.io/projected/19901e16-c93e-4806-a467-7af1e9ad9405-kube-api-access-jxq4g\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.562386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-ovs-socket\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.562447 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn4gn\" (UniqueName: \"kubernetes.io/projected/48ae603d-54fb-4c62-8c02-1e9d6034ca81-kube-api-access-pn4gn\") pod \"nmstate-metrics-58fcddf996-5nhb7\" (UID: \"48ae603d-54fb-4c62-8c02-1e9d6034ca81\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.562508 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-dbus-socket\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.562531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fnxs\" (UniqueName: \"kubernetes.io/projected/2237dcc7-ae68-4298-87d0-44d81d96b3c5-kube-api-access-6fnxs\") pod \"nmstate-webhook-6d689559c5-tk6l4\" (UID: \"2237dcc7-ae68-4298-87d0-44d81d96b3c5\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.601233 4749 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.601984 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.605611 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.609168 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.613364 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rcpht" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.619944 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.663969 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fnxs\" (UniqueName: \"kubernetes.io/projected/2237dcc7-ae68-4298-87d0-44d81d96b3c5-kube-api-access-6fnxs\") pod \"nmstate-webhook-6d689559c5-tk6l4\" (UID: \"2237dcc7-ae68-4298-87d0-44d81d96b3c5\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-dbus-socket\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664043 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cf5a15f7-d043-4b90-828f-584b833d38e5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mgf8\" (UniqueName: \"kubernetes.io/projected/cf5a15f7-d043-4b90-828f-584b833d38e5-kube-api-access-8mgf8\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-nmstate-lock\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2237dcc7-ae68-4298-87d0-44d81d96b3c5-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-tk6l4\" (UID: \"2237dcc7-ae68-4298-87d0-44d81d96b3c5\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664279 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-nmstate-lock\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: E1001 13:19:00.664288 4749 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret 
"openshift-nmstate-webhook" not found Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664346 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cf5a15f7-d043-4b90-828f-584b833d38e5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: E1001 13:19:00.664389 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2237dcc7-ae68-4298-87d0-44d81d96b3c5-tls-key-pair podName:2237dcc7-ae68-4298-87d0-44d81d96b3c5 nodeName:}" failed. No retries permitted until 2025-10-01 13:19:01.164370307 +0000 UTC m=+801.218355206 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/2237dcc7-ae68-4298-87d0-44d81d96b3c5-tls-key-pair") pod "nmstate-webhook-6d689559c5-tk6l4" (UID: "2237dcc7-ae68-4298-87d0-44d81d96b3c5") : secret "openshift-nmstate-webhook" not found Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664400 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-dbus-socket\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664598 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxq4g\" (UniqueName: \"kubernetes.io/projected/19901e16-c93e-4806-a467-7af1e9ad9405-kube-api-access-jxq4g\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664647 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-ovs-socket\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664709 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn4gn\" (UniqueName: \"kubernetes.io/projected/48ae603d-54fb-4c62-8c02-1e9d6034ca81-kube-api-access-pn4gn\") pod \"nmstate-metrics-58fcddf996-5nhb7\" (UID: \"48ae603d-54fb-4c62-8c02-1e9d6034ca81\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.664722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/19901e16-c93e-4806-a467-7af1e9ad9405-ovs-socket\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.681165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fnxs\" (UniqueName: \"kubernetes.io/projected/2237dcc7-ae68-4298-87d0-44d81d96b3c5-kube-api-access-6fnxs\") pod \"nmstate-webhook-6d689559c5-tk6l4\" (UID: \"2237dcc7-ae68-4298-87d0-44d81d96b3c5\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.682153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn4gn\" (UniqueName: \"kubernetes.io/projected/48ae603d-54fb-4c62-8c02-1e9d6034ca81-kube-api-access-pn4gn\") pod \"nmstate-metrics-58fcddf996-5nhb7\" (UID: \"48ae603d-54fb-4c62-8c02-1e9d6034ca81\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.686950 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jxq4g\" (UniqueName: \"kubernetes.io/projected/19901e16-c93e-4806-a467-7af1e9ad9405-kube-api-access-jxq4g\") pod \"nmstate-handler-vd27x\" (UID: \"19901e16-c93e-4806-a467-7af1e9ad9405\") " pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.766392 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf5a15f7-d043-4b90-828f-584b833d38e5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.766472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mgf8\" (UniqueName: \"kubernetes.io/projected/cf5a15f7-d043-4b90-828f-584b833d38e5-kube-api-access-8mgf8\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.766536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cf5a15f7-d043-4b90-828f-584b833d38e5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.767555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cf5a15f7-d043-4b90-828f-584b833d38e5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc 
kubenswrapper[4749]: I1001 13:19:00.769481 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf5a15f7-d043-4b90-828f-584b833d38e5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.783027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mgf8\" (UniqueName: \"kubernetes.io/projected/cf5a15f7-d043-4b90-828f-584b833d38e5-kube-api-access-8mgf8\") pod \"nmstate-console-plugin-864bb6dfb5-64bz9\" (UID: \"cf5a15f7-d043-4b90-828f-584b833d38e5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.799082 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.810933 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5687495646-hrvpz"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.811589 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.820104 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5687495646-hrvpz"] Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.856633 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.867458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-oauth-config\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.867515 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-trusted-ca-bundle\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.867535 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwlrk\" (UniqueName: \"kubernetes.io/projected/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-kube-api-access-zwlrk\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.867576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-service-ca\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.867598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-serving-cert\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.867627 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-config\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.867656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-oauth-serving-cert\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: W1001 13:19:00.875198 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19901e16_c93e_4806_a467_7af1e9ad9405.slice/crio-f1d605cd0a551d9ae9c7b547914c701b8479a5845c21619505cd23e77c31052d WatchSource:0}: Error finding container f1d605cd0a551d9ae9c7b547914c701b8479a5845c21619505cd23e77c31052d: Status 404 returned error can't find the container with id f1d605cd0a551d9ae9c7b547914c701b8479a5845c21619505cd23e77c31052d Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.916850 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.969034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-oauth-config\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.969067 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-trusted-ca-bundle\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.969086 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwlrk\" (UniqueName: \"kubernetes.io/projected/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-kube-api-access-zwlrk\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.969128 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-service-ca\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.969167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-serving-cert\") pod \"console-5687495646-hrvpz\" (UID: 
\"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.969198 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-config\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.969240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-oauth-serving-cert\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.970139 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-oauth-serving-cert\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.970791 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-service-ca\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.970809 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-trusted-ca-bundle\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " 
pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.971331 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-config\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.974807 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-oauth-config\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.975025 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-console-serving-cert\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:00 crc kubenswrapper[4749]: I1001 13:19:00.989359 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwlrk\" (UniqueName: \"kubernetes.io/projected/fe369d5e-92ba-4907-b4e9-b56bcaea3ad5-kube-api-access-zwlrk\") pod \"console-5687495646-hrvpz\" (UID: \"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5\") " pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.155261 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.172439 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2237dcc7-ae68-4298-87d0-44d81d96b3c5-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-tk6l4\" (UID: \"2237dcc7-ae68-4298-87d0-44d81d96b3c5\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.177279 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2237dcc7-ae68-4298-87d0-44d81d96b3c5-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-tk6l4\" (UID: \"2237dcc7-ae68-4298-87d0-44d81d96b3c5\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.198028 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7"] Oct 01 13:19:01 crc kubenswrapper[4749]: W1001 13:19:01.201451 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48ae603d_54fb_4c62_8c02_1e9d6034ca81.slice/crio-0208228d0a3913a5f5323baa3f0eb6108b87d5ab17bf06ebdd87c30ae0684d5c WatchSource:0}: Error finding container 0208228d0a3913a5f5323baa3f0eb6108b87d5ab17bf06ebdd87c30ae0684d5c: Status 404 returned error can't find the container with id 0208228d0a3913a5f5323baa3f0eb6108b87d5ab17bf06ebdd87c30ae0684d5c Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.336280 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9"] Oct 01 13:19:01 crc kubenswrapper[4749]: W1001 13:19:01.339342 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf5a15f7_d043_4b90_828f_584b833d38e5.slice/crio-90ce79d7ed18e5afccabee8f4ddb1575b9de4dccf5c794cbdce11655a5c4527a WatchSource:0}: Error finding container 90ce79d7ed18e5afccabee8f4ddb1575b9de4dccf5c794cbdce11655a5c4527a: Status 404 returned error can't find the container with id 90ce79d7ed18e5afccabee8f4ddb1575b9de4dccf5c794cbdce11655a5c4527a Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.362520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" event={"ID":"48ae603d-54fb-4c62-8c02-1e9d6034ca81","Type":"ContainerStarted","Data":"0208228d0a3913a5f5323baa3f0eb6108b87d5ab17bf06ebdd87c30ae0684d5c"} Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.364108 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" event={"ID":"cf5a15f7-d043-4b90-828f-584b833d38e5","Type":"ContainerStarted","Data":"90ce79d7ed18e5afccabee8f4ddb1575b9de4dccf5c794cbdce11655a5c4527a"} Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.367842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vd27x" event={"ID":"19901e16-c93e-4806-a467-7af1e9ad9405","Type":"ContainerStarted","Data":"f1d605cd0a551d9ae9c7b547914c701b8479a5845c21619505cd23e77c31052d"} Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.428054 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.638428 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5687495646-hrvpz"] Oct 01 13:19:01 crc kubenswrapper[4749]: I1001 13:19:01.915449 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4"] Oct 01 13:19:02 crc kubenswrapper[4749]: I1001 13:19:02.374277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" event={"ID":"2237dcc7-ae68-4298-87d0-44d81d96b3c5","Type":"ContainerStarted","Data":"bd163ad127face25eda7b1514262940a8fbdcdb936ec7025175c9f2329590014"} Oct 01 13:19:02 crc kubenswrapper[4749]: I1001 13:19:02.376687 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5687495646-hrvpz" event={"ID":"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5","Type":"ContainerStarted","Data":"607cc4111afcb8e4fec9a6db5c7b9135f3259bdf20f8129999ede0a88eadd77d"} Oct 01 13:19:02 crc kubenswrapper[4749]: I1001 13:19:02.376712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5687495646-hrvpz" event={"ID":"fe369d5e-92ba-4907-b4e9-b56bcaea3ad5","Type":"ContainerStarted","Data":"55acaf2b0bfc78d5825a52e1abc994be9cc85b6f46e17c06ee74611ffb79e8ba"} Oct 01 13:19:03 crc kubenswrapper[4749]: I1001 13:19:03.072212 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6xfbz" Oct 01 13:19:03 crc kubenswrapper[4749]: I1001 13:19:03.090455 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5687495646-hrvpz" podStartSLOduration=3.090437171 podStartE2EDuration="3.090437171s" podCreationTimestamp="2025-10-01 13:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 13:19:02.406940794 +0000 UTC m=+802.460925733" watchObservedRunningTime="2025-10-01 13:19:03.090437171 +0000 UTC m=+803.144422090" Oct 01 13:19:03 crc kubenswrapper[4749]: I1001 13:19:03.108939 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6xfbz" Oct 01 13:19:03 crc kubenswrapper[4749]: I1001 13:19:03.304501 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6xfbz"] Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.390150 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vd27x" event={"ID":"19901e16-c93e-4806-a467-7af1e9ad9405","Type":"ContainerStarted","Data":"3ffc76a5e88259f34f5a070f03575932bfbffc9acee59176ac4a16ca4a425465"} Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.390605 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.391749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" event={"ID":"2237dcc7-ae68-4298-87d0-44d81d96b3c5","Type":"ContainerStarted","Data":"c8884ad7e549f4c588ba187f0d5642bd01371427355b55a3a373687d960e7dfd"} Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.392288 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.393709 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" event={"ID":"48ae603d-54fb-4c62-8c02-1e9d6034ca81","Type":"ContainerStarted","Data":"76fbbcd95ba8a0e2d7d30763e6aa402a0fe23dafb594dd765f62c045dae7d055"} Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.393957 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-6xfbz" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="registry-server" containerID="cri-o://6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552" gracePeriod=2 Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.410808 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-vd27x" podStartSLOduration=1.818670939 podStartE2EDuration="4.410789831s" podCreationTimestamp="2025-10-01 13:19:00 +0000 UTC" firstStartedPulling="2025-10-01 13:19:00.877481528 +0000 UTC m=+800.931466427" lastFinishedPulling="2025-10-01 13:19:03.46960038 +0000 UTC m=+803.523585319" observedRunningTime="2025-10-01 13:19:04.409665709 +0000 UTC m=+804.463650618" watchObservedRunningTime="2025-10-01 13:19:04.410789831 +0000 UTC m=+804.464774730" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.429685 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" podStartSLOduration=2.895225578 podStartE2EDuration="4.429665919s" podCreationTimestamp="2025-10-01 13:19:00 +0000 UTC" firstStartedPulling="2025-10-01 13:19:01.936884259 +0000 UTC m=+801.990869158" lastFinishedPulling="2025-10-01 13:19:03.4713246 +0000 UTC m=+803.525309499" observedRunningTime="2025-10-01 13:19:04.426305402 +0000 UTC m=+804.480290291" watchObservedRunningTime="2025-10-01 13:19:04.429665919 +0000 UTC m=+804.483650828" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.784810 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6xfbz" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.835868 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-utilities\") pod \"ce431d3d-dfe9-4057-8416-d8f9af44237b\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.835915 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-catalog-content\") pod \"ce431d3d-dfe9-4057-8416-d8f9af44237b\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.835980 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nt6h\" (UniqueName: \"kubernetes.io/projected/ce431d3d-dfe9-4057-8416-d8f9af44237b-kube-api-access-7nt6h\") pod \"ce431d3d-dfe9-4057-8416-d8f9af44237b\" (UID: \"ce431d3d-dfe9-4057-8416-d8f9af44237b\") " Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.837514 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-utilities" (OuterVolumeSpecName: "utilities") pod "ce431d3d-dfe9-4057-8416-d8f9af44237b" (UID: "ce431d3d-dfe9-4057-8416-d8f9af44237b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.843411 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce431d3d-dfe9-4057-8416-d8f9af44237b-kube-api-access-7nt6h" (OuterVolumeSpecName: "kube-api-access-7nt6h") pod "ce431d3d-dfe9-4057-8416-d8f9af44237b" (UID: "ce431d3d-dfe9-4057-8416-d8f9af44237b"). InnerVolumeSpecName "kube-api-access-7nt6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.919949 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce431d3d-dfe9-4057-8416-d8f9af44237b" (UID: "ce431d3d-dfe9-4057-8416-d8f9af44237b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.937498 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.937537 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce431d3d-dfe9-4057-8416-d8f9af44237b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:04 crc kubenswrapper[4749]: I1001 13:19:04.937578 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nt6h\" (UniqueName: \"kubernetes.io/projected/ce431d3d-dfe9-4057-8416-d8f9af44237b-kube-api-access-7nt6h\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.402817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" event={"ID":"cf5a15f7-d043-4b90-828f-584b833d38e5","Type":"ContainerStarted","Data":"a9e79c2fe2b1fda0fd3fb7dcb594daa559fb8b1ab72858ef8d01a9728e58c52f"} Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.406408 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6xfbz" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.406363 4749 generic.go:334] "Generic (PLEG): container finished" podID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerID="6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552" exitCode=0 Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.406950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xfbz" event={"ID":"ce431d3d-dfe9-4057-8416-d8f9af44237b","Type":"ContainerDied","Data":"6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552"} Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.407038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6xfbz" event={"ID":"ce431d3d-dfe9-4057-8416-d8f9af44237b","Type":"ContainerDied","Data":"66eb5f84e3d21c1227fd39ffe2eac5ccfe216de75f1cfaa9c4e99e131a91897a"} Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.407116 4749 scope.go:117] "RemoveContainer" containerID="6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.423312 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-64bz9" podStartSLOduration=2.435155612 podStartE2EDuration="5.423296461s" podCreationTimestamp="2025-10-01 13:19:00 +0000 UTC" firstStartedPulling="2025-10-01 13:19:01.341748636 +0000 UTC m=+801.395733545" lastFinishedPulling="2025-10-01 13:19:04.329889495 +0000 UTC m=+804.383874394" observedRunningTime="2025-10-01 13:19:05.421794167 +0000 UTC m=+805.475779076" watchObservedRunningTime="2025-10-01 13:19:05.423296461 +0000 UTC m=+805.477281350" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.435859 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6xfbz"] Oct 01 13:19:05 crc kubenswrapper[4749]: 
I1001 13:19:05.445311 4749 scope.go:117] "RemoveContainer" containerID="ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.452746 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6xfbz"] Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.477403 4749 scope.go:117] "RemoveContainer" containerID="3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.504622 4749 scope.go:117] "RemoveContainer" containerID="6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552" Oct 01 13:19:05 crc kubenswrapper[4749]: E1001 13:19:05.505333 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552\": container with ID starting with 6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552 not found: ID does not exist" containerID="6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.505377 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552"} err="failed to get container status \"6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552\": rpc error: code = NotFound desc = could not find container \"6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552\": container with ID starting with 6bbe16589a033ef6126b4377962cf0f59bb71e69c44b3a7081bf2d69af678552 not found: ID does not exist" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.505403 4749 scope.go:117] "RemoveContainer" containerID="ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0" Oct 01 13:19:05 crc kubenswrapper[4749]: E1001 13:19:05.505936 4749 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0\": container with ID starting with ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0 not found: ID does not exist" containerID="ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.505976 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0"} err="failed to get container status \"ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0\": rpc error: code = NotFound desc = could not find container \"ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0\": container with ID starting with ded905abc7da1477080b6091f21987ba818a16fce462ea352f029c509718afa0 not found: ID does not exist" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.505991 4749 scope.go:117] "RemoveContainer" containerID="3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13" Oct 01 13:19:05 crc kubenswrapper[4749]: E1001 13:19:05.506468 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13\": container with ID starting with 3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13 not found: ID does not exist" containerID="3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13" Oct 01 13:19:05 crc kubenswrapper[4749]: I1001 13:19:05.506497 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13"} err="failed to get container status \"3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13\": rpc error: code = NotFound desc = could not find 
container \"3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13\": container with ID starting with 3ca5c10b48834b5ac43cc74fc282e43e85cb19fa35546f3e88fd084e01fd1c13 not found: ID does not exist" Oct 01 13:19:07 crc kubenswrapper[4749]: I1001 13:19:07.238661 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" path="/var/lib/kubelet/pods/ce431d3d-dfe9-4057-8416-d8f9af44237b/volumes" Oct 01 13:19:08 crc kubenswrapper[4749]: I1001 13:19:08.436036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" event={"ID":"48ae603d-54fb-4c62-8c02-1e9d6034ca81","Type":"ContainerStarted","Data":"63104f5f403055644d1a6305786b9b6e4428622a214d25f341bbd292226169cb"} Oct 01 13:19:08 crc kubenswrapper[4749]: I1001 13:19:08.460953 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-5nhb7" podStartSLOduration=2.332764293 podStartE2EDuration="8.460918226s" podCreationTimestamp="2025-10-01 13:19:00 +0000 UTC" firstStartedPulling="2025-10-01 13:19:01.204819374 +0000 UTC m=+801.258804273" lastFinishedPulling="2025-10-01 13:19:07.332973317 +0000 UTC m=+807.386958206" observedRunningTime="2025-10-01 13:19:08.455058356 +0000 UTC m=+808.509043295" watchObservedRunningTime="2025-10-01 13:19:08.460918226 +0000 UTC m=+808.514903175" Oct 01 13:19:10 crc kubenswrapper[4749]: I1001 13:19:10.900852 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-vd27x" Oct 01 13:19:11 crc kubenswrapper[4749]: I1001 13:19:11.156128 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:11 crc kubenswrapper[4749]: I1001 13:19:11.171574 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:11 crc 
kubenswrapper[4749]: I1001 13:19:11.176711 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:11 crc kubenswrapper[4749]: I1001 13:19:11.466390 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5687495646-hrvpz" Oct 01 13:19:11 crc kubenswrapper[4749]: I1001 13:19:11.557488 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nsv4j"] Oct 01 13:19:12 crc kubenswrapper[4749]: I1001 13:19:12.987083 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s6tcd"] Oct 01 13:19:12 crc kubenswrapper[4749]: E1001 13:19:12.987573 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="extract-content" Oct 01 13:19:12 crc kubenswrapper[4749]: I1001 13:19:12.987604 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="extract-content" Oct 01 13:19:12 crc kubenswrapper[4749]: E1001 13:19:12.987652 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="extract-utilities" Oct 01 13:19:12 crc kubenswrapper[4749]: I1001 13:19:12.987671 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="extract-utilities" Oct 01 13:19:12 crc kubenswrapper[4749]: E1001 13:19:12.987702 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="registry-server" Oct 01 13:19:12 crc kubenswrapper[4749]: I1001 13:19:12.987720 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="registry-server" Oct 01 13:19:12 crc kubenswrapper[4749]: I1001 13:19:12.987977 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="ce431d3d-dfe9-4057-8416-d8f9af44237b" containerName="registry-server" Oct 01 13:19:12 crc kubenswrapper[4749]: I1001 13:19:12.990168 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.011372 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s6tcd"] Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.067068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-utilities\") pod \"certified-operators-s6tcd\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.067318 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-catalog-content\") pod \"certified-operators-s6tcd\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.067383 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bj4k\" (UniqueName: \"kubernetes.io/projected/b125fd5a-942e-48e2-9d8b-b8051da0f56f-kube-api-access-2bj4k\") pod \"certified-operators-s6tcd\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.169029 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-catalog-content\") pod 
\"certified-operators-s6tcd\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.169097 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bj4k\" (UniqueName: \"kubernetes.io/projected/b125fd5a-942e-48e2-9d8b-b8051da0f56f-kube-api-access-2bj4k\") pod \"certified-operators-s6tcd\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.169158 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-utilities\") pod \"certified-operators-s6tcd\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.169622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-catalog-content\") pod \"certified-operators-s6tcd\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.169645 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-utilities\") pod \"certified-operators-s6tcd\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.189065 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bj4k\" (UniqueName: \"kubernetes.io/projected/b125fd5a-942e-48e2-9d8b-b8051da0f56f-kube-api-access-2bj4k\") pod \"certified-operators-s6tcd\" (UID: 
\"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:13 crc kubenswrapper[4749]: I1001 13:19:13.320309 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:14 crc kubenswrapper[4749]: I1001 13:19:13.612608 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s6tcd"] Oct 01 13:19:14 crc kubenswrapper[4749]: I1001 13:19:14.492544 4749 generic.go:334] "Generic (PLEG): container finished" podID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerID="83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099" exitCode=0 Oct 01 13:19:14 crc kubenswrapper[4749]: I1001 13:19:14.492919 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6tcd" event={"ID":"b125fd5a-942e-48e2-9d8b-b8051da0f56f","Type":"ContainerDied","Data":"83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099"} Oct 01 13:19:14 crc kubenswrapper[4749]: I1001 13:19:14.493344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6tcd" event={"ID":"b125fd5a-942e-48e2-9d8b-b8051da0f56f","Type":"ContainerStarted","Data":"43647692cf9f8582a9f919fd1a1c4e2697fbf1acf959a8e86d228071318d0789"} Oct 01 13:19:15 crc kubenswrapper[4749]: I1001 13:19:15.502680 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6tcd" event={"ID":"b125fd5a-942e-48e2-9d8b-b8051da0f56f","Type":"ContainerStarted","Data":"f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662"} Oct 01 13:19:16 crc kubenswrapper[4749]: I1001 13:19:16.511425 4749 generic.go:334] "Generic (PLEG): container finished" podID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerID="f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662" exitCode=0 Oct 01 13:19:16 crc kubenswrapper[4749]: I1001 
13:19:16.511516 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6tcd" event={"ID":"b125fd5a-942e-48e2-9d8b-b8051da0f56f","Type":"ContainerDied","Data":"f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662"} Oct 01 13:19:17 crc kubenswrapper[4749]: I1001 13:19:17.519988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6tcd" event={"ID":"b125fd5a-942e-48e2-9d8b-b8051da0f56f","Type":"ContainerStarted","Data":"38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5"} Oct 01 13:19:17 crc kubenswrapper[4749]: I1001 13:19:17.542278 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s6tcd" podStartSLOduration=3.066041596 podStartE2EDuration="5.542224713s" podCreationTimestamp="2025-10-01 13:19:12 +0000 UTC" firstStartedPulling="2025-10-01 13:19:14.495716372 +0000 UTC m=+814.549701311" lastFinishedPulling="2025-10-01 13:19:16.971899489 +0000 UTC m=+817.025884428" observedRunningTime="2025-10-01 13:19:17.539535115 +0000 UTC m=+817.593520024" watchObservedRunningTime="2025-10-01 13:19:17.542224713 +0000 UTC m=+817.596209652" Oct 01 13:19:21 crc kubenswrapper[4749]: I1001 13:19:21.439395 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-tk6l4" Oct 01 13:19:23 crc kubenswrapper[4749]: I1001 13:19:23.321006 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:23 crc kubenswrapper[4749]: I1001 13:19:23.321203 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:23 crc kubenswrapper[4749]: I1001 13:19:23.386697 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 
13:19:23 crc kubenswrapper[4749]: I1001 13:19:23.639924 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:23 crc kubenswrapper[4749]: I1001 13:19:23.706137 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s6tcd"] Oct 01 13:19:25 crc kubenswrapper[4749]: I1001 13:19:25.594364 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s6tcd" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerName="registry-server" containerID="cri-o://38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5" gracePeriod=2 Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.073100 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.172579 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bj4k\" (UniqueName: \"kubernetes.io/projected/b125fd5a-942e-48e2-9d8b-b8051da0f56f-kube-api-access-2bj4k\") pod \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.172741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-catalog-content\") pod \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.172795 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-utilities\") pod \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\" (UID: \"b125fd5a-942e-48e2-9d8b-b8051da0f56f\") " Oct 
01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.173714 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-utilities" (OuterVolumeSpecName: "utilities") pod "b125fd5a-942e-48e2-9d8b-b8051da0f56f" (UID: "b125fd5a-942e-48e2-9d8b-b8051da0f56f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.180992 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b125fd5a-942e-48e2-9d8b-b8051da0f56f-kube-api-access-2bj4k" (OuterVolumeSpecName: "kube-api-access-2bj4k") pod "b125fd5a-942e-48e2-9d8b-b8051da0f56f" (UID: "b125fd5a-942e-48e2-9d8b-b8051da0f56f"). InnerVolumeSpecName "kube-api-access-2bj4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.241023 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b125fd5a-942e-48e2-9d8b-b8051da0f56f" (UID: "b125fd5a-942e-48e2-9d8b-b8051da0f56f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.274621 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bj4k\" (UniqueName: \"kubernetes.io/projected/b125fd5a-942e-48e2-9d8b-b8051da0f56f-kube-api-access-2bj4k\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.274660 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.274673 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b125fd5a-942e-48e2-9d8b-b8051da0f56f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.605422 4749 generic.go:334] "Generic (PLEG): container finished" podID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerID="38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5" exitCode=0 Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.605488 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s6tcd" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.605492 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6tcd" event={"ID":"b125fd5a-942e-48e2-9d8b-b8051da0f56f","Type":"ContainerDied","Data":"38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5"} Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.605587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s6tcd" event={"ID":"b125fd5a-942e-48e2-9d8b-b8051da0f56f","Type":"ContainerDied","Data":"43647692cf9f8582a9f919fd1a1c4e2697fbf1acf959a8e86d228071318d0789"} Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.605620 4749 scope.go:117] "RemoveContainer" containerID="38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.635129 4749 scope.go:117] "RemoveContainer" containerID="f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.641724 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s6tcd"] Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.648255 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s6tcd"] Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.664112 4749 scope.go:117] "RemoveContainer" containerID="83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.684754 4749 scope.go:117] "RemoveContainer" containerID="38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5" Oct 01 13:19:26 crc kubenswrapper[4749]: E1001 13:19:26.685495 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5\": container with ID starting with 38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5 not found: ID does not exist" containerID="38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.685571 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5"} err="failed to get container status \"38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5\": rpc error: code = NotFound desc = could not find container \"38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5\": container with ID starting with 38b9c2cf58f545071e217d8fa68355bfa67881707dea6bd80c5a560bb15d0fb5 not found: ID does not exist" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.685611 4749 scope.go:117] "RemoveContainer" containerID="f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662" Oct 01 13:19:26 crc kubenswrapper[4749]: E1001 13:19:26.686100 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662\": container with ID starting with f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662 not found: ID does not exist" containerID="f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.686154 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662"} err="failed to get container status \"f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662\": rpc error: code = NotFound desc = could not find container \"f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662\": container with ID 
starting with f56522805aae1f6f0c0d0fd9e54a94c20f8375035f12eeeff71b10292f41c662 not found: ID does not exist" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.686189 4749 scope.go:117] "RemoveContainer" containerID="83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099" Oct 01 13:19:26 crc kubenswrapper[4749]: E1001 13:19:26.686807 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099\": container with ID starting with 83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099 not found: ID does not exist" containerID="83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099" Oct 01 13:19:26 crc kubenswrapper[4749]: I1001 13:19:26.686848 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099"} err="failed to get container status \"83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099\": rpc error: code = NotFound desc = could not find container \"83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099\": container with ID starting with 83e3717d664fda95d338dadba11fddfc8bc4da5d5bb9f6b66d498680c438f099 not found: ID does not exist" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.261709 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" path="/var/lib/kubelet/pods/b125fd5a-942e-48e2-9d8b-b8051da0f56f/volumes" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.262931 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-29bwh"] Oct 01 13:19:27 crc kubenswrapper[4749]: E1001 13:19:27.263399 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerName="registry-server" Oct 01 13:19:27 crc 
kubenswrapper[4749]: I1001 13:19:27.263421 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerName="registry-server" Oct 01 13:19:27 crc kubenswrapper[4749]: E1001 13:19:27.263438 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerName="extract-utilities" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.263447 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerName="extract-utilities" Oct 01 13:19:27 crc kubenswrapper[4749]: E1001 13:19:27.263483 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerName="extract-content" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.263492 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerName="extract-content" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.263906 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b125fd5a-942e-48e2-9d8b-b8051da0f56f" containerName="registry-server" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.270196 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.284163 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29bwh"] Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.287656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-catalog-content\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.287741 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-utilities\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.287779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x49b8\" (UniqueName: \"kubernetes.io/projected/a9955d33-193a-47fc-905b-bfc91880f5af-kube-api-access-x49b8\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.389928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-catalog-content\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.390024 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-utilities\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.390048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x49b8\" (UniqueName: \"kubernetes.io/projected/a9955d33-193a-47fc-905b-bfc91880f5af-kube-api-access-x49b8\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.390634 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-catalog-content\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.390659 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-utilities\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.411128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x49b8\" (UniqueName: \"kubernetes.io/projected/a9955d33-193a-47fc-905b-bfc91880f5af-kube-api-access-x49b8\") pod \"community-operators-29bwh\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.589560 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:27 crc kubenswrapper[4749]: I1001 13:19:27.804567 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29bwh"] Oct 01 13:19:28 crc kubenswrapper[4749]: I1001 13:19:28.625735 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9955d33-193a-47fc-905b-bfc91880f5af" containerID="a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f" exitCode=0 Oct 01 13:19:28 crc kubenswrapper[4749]: I1001 13:19:28.625792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bwh" event={"ID":"a9955d33-193a-47fc-905b-bfc91880f5af","Type":"ContainerDied","Data":"a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f"} Oct 01 13:19:28 crc kubenswrapper[4749]: I1001 13:19:28.625822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bwh" event={"ID":"a9955d33-193a-47fc-905b-bfc91880f5af","Type":"ContainerStarted","Data":"971c01da808826f896ad438844594be3e4a274a7ae4d0ac92cec30afc620de07"} Oct 01 13:19:29 crc kubenswrapper[4749]: I1001 13:19:29.636611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bwh" event={"ID":"a9955d33-193a-47fc-905b-bfc91880f5af","Type":"ContainerStarted","Data":"8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f"} Oct 01 13:19:30 crc kubenswrapper[4749]: I1001 13:19:30.647407 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9955d33-193a-47fc-905b-bfc91880f5af" containerID="8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f" exitCode=0 Oct 01 13:19:30 crc kubenswrapper[4749]: I1001 13:19:30.647520 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bwh" 
event={"ID":"a9955d33-193a-47fc-905b-bfc91880f5af","Type":"ContainerDied","Data":"8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f"} Oct 01 13:19:31 crc kubenswrapper[4749]: I1001 13:19:31.658464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bwh" event={"ID":"a9955d33-193a-47fc-905b-bfc91880f5af","Type":"ContainerStarted","Data":"365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770"} Oct 01 13:19:31 crc kubenswrapper[4749]: I1001 13:19:31.695516 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-29bwh" podStartSLOduration=2.207326381 podStartE2EDuration="4.695487236s" podCreationTimestamp="2025-10-01 13:19:27 +0000 UTC" firstStartedPulling="2025-10-01 13:19:28.629335144 +0000 UTC m=+828.683320073" lastFinishedPulling="2025-10-01 13:19:31.117496029 +0000 UTC m=+831.171480928" observedRunningTime="2025-10-01 13:19:31.690274675 +0000 UTC m=+831.744259614" watchObservedRunningTime="2025-10-01 13:19:31.695487236 +0000 UTC m=+831.749472185" Oct 01 13:19:36 crc kubenswrapper[4749]: I1001 13:19:36.624177 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nsv4j" podUID="bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" containerName="console" containerID="cri-o://9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91" gracePeriod=15 Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.073973 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nsv4j_bc67adb4-6956-4fa8-8dea-ca8e894ccb6d/console/0.log" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.074285 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.140906 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-config\") pod \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.141014 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-serving-cert\") pod \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.141085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6tsd\" (UniqueName: \"kubernetes.io/projected/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-kube-api-access-g6tsd\") pod \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.141149 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-trusted-ca-bundle\") pod \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.141198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-oauth-serving-cert\") pod \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.141253 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-service-ca\") pod \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.141319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-oauth-config\") pod \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\" (UID: \"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d\") " Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.142031 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-config" (OuterVolumeSpecName: "console-config") pod "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" (UID: "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.142542 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" (UID: "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.142810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" (UID: "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.143269 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-service-ca" (OuterVolumeSpecName: "service-ca") pod "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" (UID: "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.148898 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" (UID: "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.150348 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-kube-api-access-g6tsd" (OuterVolumeSpecName: "kube-api-access-g6tsd") pod "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" (UID: "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d"). InnerVolumeSpecName "kube-api-access-g6tsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.155654 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" (UID: "bc67adb4-6956-4fa8-8dea-ca8e894ccb6d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.243070 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6tsd\" (UniqueName: \"kubernetes.io/projected/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-kube-api-access-g6tsd\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.243298 4749 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.243379 4749 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.243449 4749 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.243546 4749 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.243640 4749 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.243710 4749 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:37 crc 
kubenswrapper[4749]: I1001 13:19:37.589803 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.589873 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.642867 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.698208 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nsv4j_bc67adb4-6956-4fa8-8dea-ca8e894ccb6d/console/0.log" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.698269 4749 generic.go:334] "Generic (PLEG): container finished" podID="bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" containerID="9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91" exitCode=2 Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.698350 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-nsv4j" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.698336 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nsv4j" event={"ID":"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d","Type":"ContainerDied","Data":"9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91"} Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.698403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nsv4j" event={"ID":"bc67adb4-6956-4fa8-8dea-ca8e894ccb6d","Type":"ContainerDied","Data":"adfa3a1ffa1141b238ffa865a22485ef5737c10a702ffbf8e058453175350053"} Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.698425 4749 scope.go:117] "RemoveContainer" containerID="9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.723174 4749 scope.go:117] "RemoveContainer" containerID="9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91" Oct 01 13:19:37 crc kubenswrapper[4749]: E1001 13:19:37.723694 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91\": container with ID starting with 9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91 not found: ID does not exist" containerID="9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.723720 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91"} err="failed to get container status \"9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91\": rpc error: code = NotFound desc = could not find container \"9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91\": 
container with ID starting with 9d94d90c4b066e3f30be7a978f02a15a64ca99fbbfceed4a75117da8e6d67a91 not found: ID does not exist" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.727358 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nsv4j"] Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.732933 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nsv4j"] Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.738996 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:37 crc kubenswrapper[4749]: I1001 13:19:37.868556 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29bwh"] Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.323758 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd"] Oct 01 13:19:38 crc kubenswrapper[4749]: E1001 13:19:38.324298 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" containerName="console" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.324313 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" containerName="console" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.324428 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" containerName="console" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.325343 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.327779 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.342009 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd"] Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.368336 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.368423 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.368723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwh5w\" (UniqueName: \"kubernetes.io/projected/019c51ec-3989-4693-9e3c-01beb6bdff4a-kube-api-access-hwh5w\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: 
I1001 13:19:38.469269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.469328 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.469390 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwh5w\" (UniqueName: \"kubernetes.io/projected/019c51ec-3989-4693-9e3c-01beb6bdff4a-kube-api-access-hwh5w\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.470427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.470508 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.493005 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwh5w\" (UniqueName: \"kubernetes.io/projected/019c51ec-3989-4693-9e3c-01beb6bdff4a-kube-api-access-hwh5w\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.639981 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:38 crc kubenswrapper[4749]: I1001 13:19:38.916135 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd"] Oct 01 13:19:39 crc kubenswrapper[4749]: I1001 13:19:39.245181 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc67adb4-6956-4fa8-8dea-ca8e894ccb6d" path="/var/lib/kubelet/pods/bc67adb4-6956-4fa8-8dea-ca8e894ccb6d/volumes" Oct 01 13:19:39 crc kubenswrapper[4749]: I1001 13:19:39.711970 4749 generic.go:334] "Generic (PLEG): container finished" podID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerID="82aa1b16e79fbbb1f755b610c2d0e55aee63f1dc24d7556a63cdcad85c7626e5" exitCode=0 Oct 01 13:19:39 crc kubenswrapper[4749]: I1001 13:19:39.712010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" 
event={"ID":"019c51ec-3989-4693-9e3c-01beb6bdff4a","Type":"ContainerDied","Data":"82aa1b16e79fbbb1f755b610c2d0e55aee63f1dc24d7556a63cdcad85c7626e5"} Oct 01 13:19:39 crc kubenswrapper[4749]: I1001 13:19:39.712054 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" event={"ID":"019c51ec-3989-4693-9e3c-01beb6bdff4a","Type":"ContainerStarted","Data":"d70dd419cfbf95b17622ddfed5719ce248e4dab762173ab5986b95a061ebae89"} Oct 01 13:19:39 crc kubenswrapper[4749]: I1001 13:19:39.712267 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-29bwh" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" containerName="registry-server" containerID="cri-o://365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770" gracePeriod=2 Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.191555 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.297632 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-catalog-content\") pod \"a9955d33-193a-47fc-905b-bfc91880f5af\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.297757 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x49b8\" (UniqueName: \"kubernetes.io/projected/a9955d33-193a-47fc-905b-bfc91880f5af-kube-api-access-x49b8\") pod \"a9955d33-193a-47fc-905b-bfc91880f5af\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.297860 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-utilities\") pod \"a9955d33-193a-47fc-905b-bfc91880f5af\" (UID: \"a9955d33-193a-47fc-905b-bfc91880f5af\") " Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.299003 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-utilities" (OuterVolumeSpecName: "utilities") pod "a9955d33-193a-47fc-905b-bfc91880f5af" (UID: "a9955d33-193a-47fc-905b-bfc91880f5af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.304596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9955d33-193a-47fc-905b-bfc91880f5af-kube-api-access-x49b8" (OuterVolumeSpecName: "kube-api-access-x49b8") pod "a9955d33-193a-47fc-905b-bfc91880f5af" (UID: "a9955d33-193a-47fc-905b-bfc91880f5af"). InnerVolumeSpecName "kube-api-access-x49b8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.374672 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9955d33-193a-47fc-905b-bfc91880f5af" (UID: "a9955d33-193a-47fc-905b-bfc91880f5af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.399089 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.399148 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9955d33-193a-47fc-905b-bfc91880f5af-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.399169 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x49b8\" (UniqueName: \"kubernetes.io/projected/a9955d33-193a-47fc-905b-bfc91880f5af-kube-api-access-x49b8\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.722181 4749 generic.go:334] "Generic (PLEG): container finished" podID="a9955d33-193a-47fc-905b-bfc91880f5af" containerID="365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770" exitCode=0 Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.722264 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29bwh" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.722261 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bwh" event={"ID":"a9955d33-193a-47fc-905b-bfc91880f5af","Type":"ContainerDied","Data":"365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770"} Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.722438 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bwh" event={"ID":"a9955d33-193a-47fc-905b-bfc91880f5af","Type":"ContainerDied","Data":"971c01da808826f896ad438844594be3e4a274a7ae4d0ac92cec30afc620de07"} Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.722473 4749 scope.go:117] "RemoveContainer" containerID="365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.753666 4749 scope.go:117] "RemoveContainer" containerID="8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.779507 4749 scope.go:117] "RemoveContainer" containerID="a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.780977 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29bwh"] Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.790737 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-29bwh"] Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.816599 4749 scope.go:117] "RemoveContainer" containerID="365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770" Oct 01 13:19:40 crc kubenswrapper[4749]: E1001 13:19:40.817746 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770\": container with ID starting with 365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770 not found: ID does not exist" containerID="365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.817802 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770"} err="failed to get container status \"365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770\": rpc error: code = NotFound desc = could not find container \"365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770\": container with ID starting with 365997e579546b2cb1bef185c1081934ad9a95376a49f2c1a96156c28bdf4770 not found: ID does not exist" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.817837 4749 scope.go:117] "RemoveContainer" containerID="8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f" Oct 01 13:19:40 crc kubenswrapper[4749]: E1001 13:19:40.818443 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f\": container with ID starting with 8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f not found: ID does not exist" containerID="8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.818516 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f"} err="failed to get container status \"8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f\": rpc error: code = NotFound desc = could not find container \"8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f\": container with ID 
starting with 8e7524f2dc60cf456fa223545350cae2139c1c0d08ff91fb819d054ef5cadb9f not found: ID does not exist" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.818561 4749 scope.go:117] "RemoveContainer" containerID="a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f" Oct 01 13:19:40 crc kubenswrapper[4749]: E1001 13:19:40.819090 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f\": container with ID starting with a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f not found: ID does not exist" containerID="a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f" Oct 01 13:19:40 crc kubenswrapper[4749]: I1001 13:19:40.819140 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f"} err="failed to get container status \"a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f\": rpc error: code = NotFound desc = could not find container \"a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f\": container with ID starting with a9395426aaed410e9f90ac32c23655d0059bd2ab591f2cafef84d6d52f15554f not found: ID does not exist" Oct 01 13:19:41 crc kubenswrapper[4749]: I1001 13:19:41.241958 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" path="/var/lib/kubelet/pods/a9955d33-193a-47fc-905b-bfc91880f5af/volumes" Oct 01 13:19:42 crc kubenswrapper[4749]: I1001 13:19:42.742427 4749 generic.go:334] "Generic (PLEG): container finished" podID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerID="6c6a6e2434c08683bb6d248fae95dc3d3e2b784caa4d6e44b0cc5ebebd567e8b" exitCode=0 Oct 01 13:19:42 crc kubenswrapper[4749]: I1001 13:19:42.742514 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" event={"ID":"019c51ec-3989-4693-9e3c-01beb6bdff4a","Type":"ContainerDied","Data":"6c6a6e2434c08683bb6d248fae95dc3d3e2b784caa4d6e44b0cc5ebebd567e8b"} Oct 01 13:19:43 crc kubenswrapper[4749]: I1001 13:19:43.753551 4749 generic.go:334] "Generic (PLEG): container finished" podID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerID="5253216d40474cd8cb584add336b654a840864b1f11b45524925a2d77b5025c4" exitCode=0 Oct 01 13:19:43 crc kubenswrapper[4749]: I1001 13:19:43.753600 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" event={"ID":"019c51ec-3989-4693-9e3c-01beb6bdff4a","Type":"ContainerDied","Data":"5253216d40474cd8cb584add336b654a840864b1f11b45524925a2d77b5025c4"} Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.137199 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.174820 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-bundle\") pod \"019c51ec-3989-4693-9e3c-01beb6bdff4a\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.174908 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-util\") pod \"019c51ec-3989-4693-9e3c-01beb6bdff4a\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.174960 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwh5w\" (UniqueName: 
\"kubernetes.io/projected/019c51ec-3989-4693-9e3c-01beb6bdff4a-kube-api-access-hwh5w\") pod \"019c51ec-3989-4693-9e3c-01beb6bdff4a\" (UID: \"019c51ec-3989-4693-9e3c-01beb6bdff4a\") " Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.176336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-bundle" (OuterVolumeSpecName: "bundle") pod "019c51ec-3989-4693-9e3c-01beb6bdff4a" (UID: "019c51ec-3989-4693-9e3c-01beb6bdff4a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.182851 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019c51ec-3989-4693-9e3c-01beb6bdff4a-kube-api-access-hwh5w" (OuterVolumeSpecName: "kube-api-access-hwh5w") pod "019c51ec-3989-4693-9e3c-01beb6bdff4a" (UID: "019c51ec-3989-4693-9e3c-01beb6bdff4a"). InnerVolumeSpecName "kube-api-access-hwh5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.197980 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-util" (OuterVolumeSpecName: "util") pod "019c51ec-3989-4693-9e3c-01beb6bdff4a" (UID: "019c51ec-3989-4693-9e3c-01beb6bdff4a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.278940 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.278989 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/019c51ec-3989-4693-9e3c-01beb6bdff4a-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.279009 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwh5w\" (UniqueName: \"kubernetes.io/projected/019c51ec-3989-4693-9e3c-01beb6bdff4a-kube-api-access-hwh5w\") on node \"crc\" DevicePath \"\"" Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.772937 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" event={"ID":"019c51ec-3989-4693-9e3c-01beb6bdff4a","Type":"ContainerDied","Data":"d70dd419cfbf95b17622ddfed5719ce248e4dab762173ab5986b95a061ebae89"} Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.773397 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d70dd419cfbf95b17622ddfed5719ce248e4dab762173ab5986b95a061ebae89" Oct 01 13:19:45 crc kubenswrapper[4749]: I1001 13:19:45.773056 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.055133 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8"] Oct 01 13:19:57 crc kubenswrapper[4749]: E1001 13:19:57.056019 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerName="pull" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056037 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerName="pull" Oct 01 13:19:57 crc kubenswrapper[4749]: E1001 13:19:57.056050 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" containerName="extract-utilities" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056058 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" containerName="extract-utilities" Oct 01 13:19:57 crc kubenswrapper[4749]: E1001 13:19:57.056068 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" containerName="registry-server" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056077 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" containerName="registry-server" Oct 01 13:19:57 crc kubenswrapper[4749]: E1001 13:19:57.056093 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" containerName="extract-content" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056100 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" containerName="extract-content" Oct 01 13:19:57 crc kubenswrapper[4749]: E1001 13:19:57.056111 4749 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerName="extract" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056118 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerName="extract" Oct 01 13:19:57 crc kubenswrapper[4749]: E1001 13:19:57.056133 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerName="util" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056140 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerName="util" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056282 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="019c51ec-3989-4693-9e3c-01beb6bdff4a" containerName="extract" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056299 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9955d33-193a-47fc-905b-bfc91880f5af" containerName="registry-server" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.056818 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.058976 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-b4wjz" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.059157 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.059660 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.059872 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.060147 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.068069 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8"] Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.236615 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cf2530f-bd63-401b-992b-51f01a86598c-webhook-cert\") pod \"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.236703 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p8t4\" (UniqueName: \"kubernetes.io/projected/3cf2530f-bd63-401b-992b-51f01a86598c-kube-api-access-8p8t4\") pod 
\"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.236810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cf2530f-bd63-401b-992b-51f01a86598c-apiservice-cert\") pod \"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.309266 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv"] Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.309904 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.311969 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.312707 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.312758 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-dqw8x" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.328486 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv"] Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.343199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl6sl\" (UniqueName: 
\"kubernetes.io/projected/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-kube-api-access-gl6sl\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.343260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p8t4\" (UniqueName: \"kubernetes.io/projected/3cf2530f-bd63-401b-992b-51f01a86598c-kube-api-access-8p8t4\") pod \"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.343294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cf2530f-bd63-401b-992b-51f01a86598c-apiservice-cert\") pod \"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.343314 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-webhook-cert\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.343339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-apiservice-cert\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " 
pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.343359 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cf2530f-bd63-401b-992b-51f01a86598c-webhook-cert\") pod \"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.348367 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3cf2530f-bd63-401b-992b-51f01a86598c-webhook-cert\") pod \"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.348448 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3cf2530f-bd63-401b-992b-51f01a86598c-apiservice-cert\") pod \"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.374504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p8t4\" (UniqueName: \"kubernetes.io/projected/3cf2530f-bd63-401b-992b-51f01a86598c-kube-api-access-8p8t4\") pod \"metallb-operator-controller-manager-87f8f4bcc-p49h8\" (UID: \"3cf2530f-bd63-401b-992b-51f01a86598c\") " pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.378602 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.444497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl6sl\" (UniqueName: \"kubernetes.io/projected/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-kube-api-access-gl6sl\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.444786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-webhook-cert\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.444817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-apiservice-cert\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.447923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-apiservice-cert\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.447922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-webhook-cert\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.461635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl6sl\" (UniqueName: \"kubernetes.io/projected/c2aa45a1-115c-47cc-9b5f-d1a79549a3a8-kube-api-access-gl6sl\") pod \"metallb-operator-webhook-server-5b54b59d49-p8dqv\" (UID: \"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8\") " pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.623494 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.812788 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8"] Oct 01 13:19:57 crc kubenswrapper[4749]: I1001 13:19:57.847988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" event={"ID":"3cf2530f-bd63-401b-992b-51f01a86598c","Type":"ContainerStarted","Data":"7896ed277c297d7d6dad64f5f43f04c01f091aee26bdaaf5239da8d7b94c7e35"} Oct 01 13:19:58 crc kubenswrapper[4749]: I1001 13:19:58.094987 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv"] Oct 01 13:19:58 crc kubenswrapper[4749]: W1001 13:19:58.105195 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2aa45a1_115c_47cc_9b5f_d1a79549a3a8.slice/crio-29414289e53c486dc93a7e12ec0a93d4cc9f86fdd966e3b85acc3c88c1f8efc5 WatchSource:0}: Error finding container 
29414289e53c486dc93a7e12ec0a93d4cc9f86fdd966e3b85acc3c88c1f8efc5: Status 404 returned error can't find the container with id 29414289e53c486dc93a7e12ec0a93d4cc9f86fdd966e3b85acc3c88c1f8efc5 Oct 01 13:19:58 crc kubenswrapper[4749]: I1001 13:19:58.855180 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" event={"ID":"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8","Type":"ContainerStarted","Data":"29414289e53c486dc93a7e12ec0a93d4cc9f86fdd966e3b85acc3c88c1f8efc5"} Oct 01 13:20:01 crc kubenswrapper[4749]: I1001 13:20:01.875614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" event={"ID":"3cf2530f-bd63-401b-992b-51f01a86598c","Type":"ContainerStarted","Data":"5444c40c799b121d0a6250bfd590deefae8255659738098815fb11cbdf48c2ba"} Oct 01 13:20:01 crc kubenswrapper[4749]: I1001 13:20:01.877245 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:20:01 crc kubenswrapper[4749]: I1001 13:20:01.900841 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" podStartSLOduration=1.838611445 podStartE2EDuration="4.900823652s" podCreationTimestamp="2025-10-01 13:19:57 +0000 UTC" firstStartedPulling="2025-10-01 13:19:57.829995757 +0000 UTC m=+857.883980656" lastFinishedPulling="2025-10-01 13:20:00.892207954 +0000 UTC m=+860.946192863" observedRunningTime="2025-10-01 13:20:01.899262367 +0000 UTC m=+861.953247276" watchObservedRunningTime="2025-10-01 13:20:01.900823652 +0000 UTC m=+861.954808551" Oct 01 13:20:03 crc kubenswrapper[4749]: I1001 13:20:03.892592 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" 
event={"ID":"c2aa45a1-115c-47cc-9b5f-d1a79549a3a8","Type":"ContainerStarted","Data":"832e0326a59d21be9cafa1915bfce2f35b9f467637cc6dfe2a28956537f8da25"} Oct 01 13:20:03 crc kubenswrapper[4749]: I1001 13:20:03.893056 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:20:03 crc kubenswrapper[4749]: I1001 13:20:03.926746 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" podStartSLOduration=1.9024782770000002 podStartE2EDuration="6.926729028s" podCreationTimestamp="2025-10-01 13:19:57 +0000 UTC" firstStartedPulling="2025-10-01 13:19:58.10864858 +0000 UTC m=+858.162633499" lastFinishedPulling="2025-10-01 13:20:03.132899351 +0000 UTC m=+863.186884250" observedRunningTime="2025-10-01 13:20:03.923174245 +0000 UTC m=+863.977159144" watchObservedRunningTime="2025-10-01 13:20:03.926729028 +0000 UTC m=+863.980713937" Oct 01 13:20:17 crc kubenswrapper[4749]: I1001 13:20:17.646209 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5b54b59d49-p8dqv" Oct 01 13:20:32 crc kubenswrapper[4749]: I1001 13:20:32.106330 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:20:32 crc kubenswrapper[4749]: I1001 13:20:32.107120 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:20:37 crc 
kubenswrapper[4749]: I1001 13:20:37.382343 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-87f8f4bcc-p49h8" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.127419 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs"] Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.130152 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.132093 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-zfzfb" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.134094 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.145011 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-vhppt"] Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.147787 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.153928 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.154393 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-sockets\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159402 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a398c955-5f6f-4519-8ad2-77d151718daf-cert\") pod \"frr-k8s-webhook-server-5478bdb765-g9dzs\" (UID: \"a398c955-5f6f-4519-8ad2-77d151718daf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-metrics-certs\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159485 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-conf\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159568 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wg77\" (UniqueName: \"kubernetes.io/projected/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-kube-api-access-5wg77\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159605 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-reloader\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-metrics\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gskwl\" (UniqueName: \"kubernetes.io/projected/a398c955-5f6f-4519-8ad2-77d151718daf-kube-api-access-gskwl\") pod \"frr-k8s-webhook-server-5478bdb765-g9dzs\" (UID: \"a398c955-5f6f-4519-8ad2-77d151718daf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.159831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-startup\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.201363 4749 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs"] Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.222664 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-nx44s"] Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.223517 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.225306 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.226067 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-k8pvt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.226273 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.226545 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.247713 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-zrnd6"] Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.248800 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.250427 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.260536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wg77\" (UniqueName: \"kubernetes.io/projected/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-kube-api-access-5wg77\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.260571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-reloader\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.260590 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf11208c-e4d3-4873-872a-9b6b168ff648-metallb-excludel2\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.260656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-metrics\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.260674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist\") pod 
\"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.260695 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gskwl\" (UniqueName: \"kubernetes.io/projected/a398c955-5f6f-4519-8ad2-77d151718daf-kube-api-access-gskwl\") pod \"frr-k8s-webhook-server-5478bdb765-g9dzs\" (UID: \"a398c955-5f6f-4519-8ad2-77d151718daf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261032 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-startup\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-cert\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261092 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-sockets\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-metrics\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc 
kubenswrapper[4749]: I1001 13:20:38.261117 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-metrics-certs\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261232 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwjd2\" (UniqueName: \"kubernetes.io/projected/cf11208c-e4d3-4873-872a-9b6b168ff648-kube-api-access-gwjd2\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261283 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtwzq\" (UniqueName: \"kubernetes.io/projected/286d5dec-6b31-4235-ae91-705174a2aa4e-kube-api-access-rtwzq\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-metrics-certs\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261490 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a398c955-5f6f-4519-8ad2-77d151718daf-cert\") pod \"frr-k8s-webhook-server-5478bdb765-g9dzs\" (UID: \"a398c955-5f6f-4519-8ad2-77d151718daf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:38 crc 
kubenswrapper[4749]: I1001 13:20:38.261513 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-metrics-certs\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-conf\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-conf\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: E1001 13:20:38.261954 4749 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.261973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-reloader\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: E1001 13:20:38.261997 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a398c955-5f6f-4519-8ad2-77d151718daf-cert podName:a398c955-5f6f-4519-8ad2-77d151718daf nodeName:}" failed. No retries permitted until 2025-10-01 13:20:38.761982323 +0000 UTC m=+898.815967222 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a398c955-5f6f-4519-8ad2-77d151718daf-cert") pod "frr-k8s-webhook-server-5478bdb765-g9dzs" (UID: "a398c955-5f6f-4519-8ad2-77d151718daf") : secret "frr-k8s-webhook-server-cert" not found Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.262046 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-startup\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.262378 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-frr-sockets\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.264243 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-zrnd6"] Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.270031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-metrics-certs\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.277865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wg77\" (UniqueName: \"kubernetes.io/projected/aadc0057-6a04-44e2-97cb-9f9f2e554f6f-kube-api-access-5wg77\") pod \"frr-k8s-vhppt\" (UID: \"aadc0057-6a04-44e2-97cb-9f9f2e554f6f\") " pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.292800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gskwl\" (UniqueName: \"kubernetes.io/projected/a398c955-5f6f-4519-8ad2-77d151718daf-kube-api-access-gskwl\") pod \"frr-k8s-webhook-server-5478bdb765-g9dzs\" (UID: \"a398c955-5f6f-4519-8ad2-77d151718daf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.362377 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf11208c-e4d3-4873-872a-9b6b168ff648-metallb-excludel2\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.362426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.362452 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-cert\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.362480 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-metrics-certs\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.362502 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwjd2\" (UniqueName: 
\"kubernetes.io/projected/cf11208c-e4d3-4873-872a-9b6b168ff648-kube-api-access-gwjd2\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.362531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwzq\" (UniqueName: \"kubernetes.io/projected/286d5dec-6b31-4235-ae91-705174a2aa4e-kube-api-access-rtwzq\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.362555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-metrics-certs\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: E1001 13:20:38.362642 4749 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 01 13:20:38 crc kubenswrapper[4749]: E1001 13:20:38.362687 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-metrics-certs podName:286d5dec-6b31-4235-ae91-705174a2aa4e nodeName:}" failed. No retries permitted until 2025-10-01 13:20:38.862672253 +0000 UTC m=+898.916657152 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-metrics-certs") pod "controller-5d688f5ffc-zrnd6" (UID: "286d5dec-6b31-4235-ae91-705174a2aa4e") : secret "controller-certs-secret" not found Oct 01 13:20:38 crc kubenswrapper[4749]: E1001 13:20:38.363175 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 13:20:38 crc kubenswrapper[4749]: E1001 13:20:38.363224 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist podName:cf11208c-e4d3-4873-872a-9b6b168ff648 nodeName:}" failed. No retries permitted until 2025-10-01 13:20:38.863204769 +0000 UTC m=+898.917189668 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist") pod "speaker-nx44s" (UID: "cf11208c-e4d3-4873-872a-9b6b168ff648") : secret "metallb-memberlist" not found Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.363437 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf11208c-e4d3-4873-872a-9b6b168ff648-metallb-excludel2\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.365454 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-cert\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.366761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-metrics-certs\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.379734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwjd2\" (UniqueName: \"kubernetes.io/projected/cf11208c-e4d3-4873-872a-9b6b168ff648-kube-api-access-gwjd2\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.383835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtwzq\" (UniqueName: \"kubernetes.io/projected/286d5dec-6b31-4235-ae91-705174a2aa4e-kube-api-access-rtwzq\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.466083 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.767824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a398c955-5f6f-4519-8ad2-77d151718daf-cert\") pod \"frr-k8s-webhook-server-5478bdb765-g9dzs\" (UID: \"a398c955-5f6f-4519-8ad2-77d151718daf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.773035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a398c955-5f6f-4519-8ad2-77d151718daf-cert\") pod \"frr-k8s-webhook-server-5478bdb765-g9dzs\" (UID: \"a398c955-5f6f-4519-8ad2-77d151718daf\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.868557 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-metrics-certs\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.868627 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:38 crc kubenswrapper[4749]: E1001 13:20:38.868755 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 13:20:38 crc kubenswrapper[4749]: E1001 13:20:38.868809 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist podName:cf11208c-e4d3-4873-872a-9b6b168ff648 
nodeName:}" failed. No retries permitted until 2025-10-01 13:20:39.868795554 +0000 UTC m=+899.922780453 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist") pod "speaker-nx44s" (UID: "cf11208c-e4d3-4873-872a-9b6b168ff648") : secret "metallb-memberlist" not found Oct 01 13:20:38 crc kubenswrapper[4749]: I1001 13:20:38.874108 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/286d5dec-6b31-4235-ae91-705174a2aa4e-metrics-certs\") pod \"controller-5d688f5ffc-zrnd6\" (UID: \"286d5dec-6b31-4235-ae91-705174a2aa4e\") " pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:39 crc kubenswrapper[4749]: I1001 13:20:39.051812 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:39 crc kubenswrapper[4749]: I1001 13:20:39.133813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerStarted","Data":"722db9ad88610b964f78dd39208e5c8d4ba19a9d627b4a00ecc4db84e17408aa"} Oct 01 13:20:39 crc kubenswrapper[4749]: I1001 13:20:39.164078 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:39 crc kubenswrapper[4749]: I1001 13:20:39.575059 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs"] Oct 01 13:20:39 crc kubenswrapper[4749]: I1001 13:20:39.728067 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-zrnd6"] Oct 01 13:20:39 crc kubenswrapper[4749]: I1001 13:20:39.918533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:39 crc kubenswrapper[4749]: E1001 13:20:39.918747 4749 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 13:20:39 crc kubenswrapper[4749]: E1001 13:20:39.919068 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist podName:cf11208c-e4d3-4873-872a-9b6b168ff648 nodeName:}" failed. No retries permitted until 2025-10-01 13:20:41.919047229 +0000 UTC m=+901.973032138 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist") pod "speaker-nx44s" (UID: "cf11208c-e4d3-4873-872a-9b6b168ff648") : secret "metallb-memberlist" not found Oct 01 13:20:40 crc kubenswrapper[4749]: I1001 13:20:40.143729 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-zrnd6" event={"ID":"286d5dec-6b31-4235-ae91-705174a2aa4e","Type":"ContainerStarted","Data":"0822b733c79190dd46dbc525827cc5c8829306d97c655f0115b3d6ebcd5cfaaa"} Oct 01 13:20:40 crc kubenswrapper[4749]: I1001 13:20:40.144011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-zrnd6" event={"ID":"286d5dec-6b31-4235-ae91-705174a2aa4e","Type":"ContainerStarted","Data":"3f6163a52606c3dcb940fa8ef410b721bcce52a630cfcd35b9c1ade992f3e826"} Oct 01 13:20:40 crc kubenswrapper[4749]: I1001 13:20:40.144131 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-zrnd6" event={"ID":"286d5dec-6b31-4235-ae91-705174a2aa4e","Type":"ContainerStarted","Data":"fba44f69a7dd4db9f6865880d43b5f02d93af526fabd73c07ef39d0f1c5656e5"} Oct 01 13:20:40 crc kubenswrapper[4749]: I1001 13:20:40.144315 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:40 crc kubenswrapper[4749]: I1001 13:20:40.144754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" event={"ID":"a398c955-5f6f-4519-8ad2-77d151718daf","Type":"ContainerStarted","Data":"78922ee0e0709afdbd937ea2840210de392c61b08ee41edf146821796074e83d"} Oct 01 13:20:40 crc kubenswrapper[4749]: I1001 13:20:40.161671 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-zrnd6" podStartSLOduration=2.161652027 podStartE2EDuration="2.161652027s" 
podCreationTimestamp="2025-10-01 13:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:20:40.159545786 +0000 UTC m=+900.213530695" watchObservedRunningTime="2025-10-01 13:20:40.161652027 +0000 UTC m=+900.215636956" Oct 01 13:20:41 crc kubenswrapper[4749]: I1001 13:20:41.946993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:41 crc kubenswrapper[4749]: I1001 13:20:41.957764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf11208c-e4d3-4873-872a-9b6b168ff648-memberlist\") pod \"speaker-nx44s\" (UID: \"cf11208c-e4d3-4873-872a-9b6b168ff648\") " pod="metallb-system/speaker-nx44s" Oct 01 13:20:42 crc kubenswrapper[4749]: I1001 13:20:42.139864 4749 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-k8pvt" Oct 01 13:20:42 crc kubenswrapper[4749]: I1001 13:20:42.149194 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-nx44s" Oct 01 13:20:42 crc kubenswrapper[4749]: W1001 13:20:42.188826 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf11208c_e4d3_4873_872a_9b6b168ff648.slice/crio-aa86565b734fea056f1834fe0fb29cbf20cb6ecd63a2a2295dbdd395301f6605 WatchSource:0}: Error finding container aa86565b734fea056f1834fe0fb29cbf20cb6ecd63a2a2295dbdd395301f6605: Status 404 returned error can't find the container with id aa86565b734fea056f1834fe0fb29cbf20cb6ecd63a2a2295dbdd395301f6605 Oct 01 13:20:43 crc kubenswrapper[4749]: I1001 13:20:43.177726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nx44s" event={"ID":"cf11208c-e4d3-4873-872a-9b6b168ff648","Type":"ContainerStarted","Data":"abd56eb00073adcb8e78b7b03c0581590d9f90f42495b96704165f30bcf9ceeb"} Oct 01 13:20:43 crc kubenswrapper[4749]: I1001 13:20:43.178031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nx44s" event={"ID":"cf11208c-e4d3-4873-872a-9b6b168ff648","Type":"ContainerStarted","Data":"8b33a282d2c75e99d587c4ef6f03f7a8f16ffafd7b86e05e35b6cd30c947d92b"} Oct 01 13:20:43 crc kubenswrapper[4749]: I1001 13:20:43.178045 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-nx44s" event={"ID":"cf11208c-e4d3-4873-872a-9b6b168ff648","Type":"ContainerStarted","Data":"aa86565b734fea056f1834fe0fb29cbf20cb6ecd63a2a2295dbdd395301f6605"} Oct 01 13:20:43 crc kubenswrapper[4749]: I1001 13:20:43.178254 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-nx44s" Oct 01 13:20:47 crc kubenswrapper[4749]: I1001 13:20:47.215363 4749 generic.go:334] "Generic (PLEG): container finished" podID="aadc0057-6a04-44e2-97cb-9f9f2e554f6f" containerID="10a6d061399ff39efe8e569ad8bf5bd0817e8ecdb82132fa9dd9ecfafced5bd1" exitCode=0 Oct 01 13:20:47 crc kubenswrapper[4749]: I1001 13:20:47.215473 
4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerDied","Data":"10a6d061399ff39efe8e569ad8bf5bd0817e8ecdb82132fa9dd9ecfafced5bd1"} Oct 01 13:20:47 crc kubenswrapper[4749]: I1001 13:20:47.218812 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" event={"ID":"a398c955-5f6f-4519-8ad2-77d151718daf","Type":"ContainerStarted","Data":"4650ea9d15917fe22e9587877ecd2dca68e8a4bc2a4682698a4c09afd008373c"} Oct 01 13:20:47 crc kubenswrapper[4749]: I1001 13:20:47.219150 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:47 crc kubenswrapper[4749]: I1001 13:20:47.260637 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-nx44s" podStartSLOduration=9.260618548 podStartE2EDuration="9.260618548s" podCreationTimestamp="2025-10-01 13:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:20:43.197032758 +0000 UTC m=+903.251017657" watchObservedRunningTime="2025-10-01 13:20:47.260618548 +0000 UTC m=+907.314603457" Oct 01 13:20:47 crc kubenswrapper[4749]: I1001 13:20:47.283871 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" podStartSLOduration=2.542390187 podStartE2EDuration="9.283847001s" podCreationTimestamp="2025-10-01 13:20:38 +0000 UTC" firstStartedPulling="2025-10-01 13:20:39.581124837 +0000 UTC m=+899.635109726" lastFinishedPulling="2025-10-01 13:20:46.322581601 +0000 UTC m=+906.376566540" observedRunningTime="2025-10-01 13:20:47.28240796 +0000 UTC m=+907.336392889" watchObservedRunningTime="2025-10-01 13:20:47.283847001 +0000 UTC m=+907.337831940" Oct 01 13:20:48 crc kubenswrapper[4749]: 
I1001 13:20:48.228754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerDied","Data":"6f6422b69bc65e28f1fce8414d8e1ffd14f822cfc8aa814b7d181761b5c98483"} Oct 01 13:20:48 crc kubenswrapper[4749]: I1001 13:20:48.228760 4749 generic.go:334] "Generic (PLEG): container finished" podID="aadc0057-6a04-44e2-97cb-9f9f2e554f6f" containerID="6f6422b69bc65e28f1fce8414d8e1ffd14f822cfc8aa814b7d181761b5c98483" exitCode=0 Oct 01 13:20:49 crc kubenswrapper[4749]: I1001 13:20:49.172493 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-zrnd6" Oct 01 13:20:49 crc kubenswrapper[4749]: I1001 13:20:49.237852 4749 generic.go:334] "Generic (PLEG): container finished" podID="aadc0057-6a04-44e2-97cb-9f9f2e554f6f" containerID="03d0f00e7421f056819b63792ec81ff2c24c10470458400f34b1e082b212dc5c" exitCode=0 Oct 01 13:20:49 crc kubenswrapper[4749]: I1001 13:20:49.245452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerDied","Data":"03d0f00e7421f056819b63792ec81ff2c24c10470458400f34b1e082b212dc5c"} Oct 01 13:20:50 crc kubenswrapper[4749]: I1001 13:20:50.252871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerStarted","Data":"bdd8e09924fa8df562c5a12c5007e5f870d6cf34ca4c609c87c1a719c606a2e2"} Oct 01 13:20:50 crc kubenswrapper[4749]: I1001 13:20:50.253197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerStarted","Data":"bc07f2c0c1963e1dc57e69a1cb7346631abec5583a3f74b0023e7d41ffaf40d1"} Oct 01 13:20:50 crc kubenswrapper[4749]: I1001 13:20:50.253231 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerStarted","Data":"3ec043fe624196e4463fc525b7c10be54dda61515f3ae2c27d64d7a3002112da"} Oct 01 13:20:50 crc kubenswrapper[4749]: I1001 13:20:50.253244 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerStarted","Data":"7ee0b874dbebcbb0132abb82f75cbbf25e2335351ae392574a92b84de3f65a1a"} Oct 01 13:20:50 crc kubenswrapper[4749]: I1001 13:20:50.253255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerStarted","Data":"52863c6c5397571f538c953711635888586978560fc54bdbd043d0f410d3dbc9"} Oct 01 13:20:51 crc kubenswrapper[4749]: I1001 13:20:51.271842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-vhppt" event={"ID":"aadc0057-6a04-44e2-97cb-9f9f2e554f6f","Type":"ContainerStarted","Data":"f726921c60067330470ec95e34cd247776b057ffbeb44de48d3d7e898c6ed47c"} Oct 01 13:20:51 crc kubenswrapper[4749]: I1001 13:20:51.272202 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:51 crc kubenswrapper[4749]: I1001 13:20:51.311611 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-vhppt" podStartSLOduration=5.604816521 podStartE2EDuration="13.311577978s" podCreationTimestamp="2025-10-01 13:20:38 +0000 UTC" firstStartedPulling="2025-10-01 13:20:38.567120053 +0000 UTC m=+898.621104952" lastFinishedPulling="2025-10-01 13:20:46.27388147 +0000 UTC m=+906.327866409" observedRunningTime="2025-10-01 13:20:51.304465675 +0000 UTC m=+911.358450654" watchObservedRunningTime="2025-10-01 13:20:51.311577978 +0000 UTC m=+911.365562937" Oct 01 13:20:52 crc kubenswrapper[4749]: I1001 13:20:52.154920 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/speaker-nx44s" Oct 01 13:20:53 crc kubenswrapper[4749]: I1001 13:20:53.466878 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:53 crc kubenswrapper[4749]: I1001 13:20:53.550099 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-vhppt" Oct 01 13:20:58 crc kubenswrapper[4749]: I1001 13:20:58.836739 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-brkz8"] Oct 01 13:20:58 crc kubenswrapper[4749]: I1001 13:20:58.838591 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-brkz8" Oct 01 13:20:58 crc kubenswrapper[4749]: I1001 13:20:58.841187 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 13:20:58 crc kubenswrapper[4749]: I1001 13:20:58.841197 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 13:20:58 crc kubenswrapper[4749]: I1001 13:20:58.841310 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-z44zk" Oct 01 13:20:58 crc kubenswrapper[4749]: I1001 13:20:58.853629 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-brkz8"] Oct 01 13:20:58 crc kubenswrapper[4749]: I1001 13:20:58.990011 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8m8g\" (UniqueName: \"kubernetes.io/projected/a1ea70d3-b7ac-4f58-98eb-9c900ab32bff-kube-api-access-t8m8g\") pod \"openstack-operator-index-brkz8\" (UID: \"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff\") " pod="openstack-operators/openstack-operator-index-brkz8" Oct 01 13:20:59 crc kubenswrapper[4749]: I1001 
13:20:59.073062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-g9dzs" Oct 01 13:20:59 crc kubenswrapper[4749]: I1001 13:20:59.091478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8m8g\" (UniqueName: \"kubernetes.io/projected/a1ea70d3-b7ac-4f58-98eb-9c900ab32bff-kube-api-access-t8m8g\") pod \"openstack-operator-index-brkz8\" (UID: \"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff\") " pod="openstack-operators/openstack-operator-index-brkz8" Oct 01 13:20:59 crc kubenswrapper[4749]: I1001 13:20:59.121309 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8m8g\" (UniqueName: \"kubernetes.io/projected/a1ea70d3-b7ac-4f58-98eb-9c900ab32bff-kube-api-access-t8m8g\") pod \"openstack-operator-index-brkz8\" (UID: \"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff\") " pod="openstack-operators/openstack-operator-index-brkz8" Oct 01 13:20:59 crc kubenswrapper[4749]: I1001 13:20:59.162746 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-brkz8" Oct 01 13:20:59 crc kubenswrapper[4749]: I1001 13:20:59.444357 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-brkz8"] Oct 01 13:21:00 crc kubenswrapper[4749]: I1001 13:21:00.354995 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-brkz8" event={"ID":"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff","Type":"ContainerStarted","Data":"c5ce94c718c22dc527afccaae6ca4b0dbcad615f2749070118f5a5175221af60"} Oct 01 13:21:02 crc kubenswrapper[4749]: I1001 13:21:02.107278 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:21:02 crc kubenswrapper[4749]: I1001 13:21:02.107944 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:21:02 crc kubenswrapper[4749]: I1001 13:21:02.371490 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-brkz8" event={"ID":"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff","Type":"ContainerStarted","Data":"093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08"} Oct 01 13:21:02 crc kubenswrapper[4749]: I1001 13:21:02.392386 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-brkz8" podStartSLOduration=1.705982061 podStartE2EDuration="4.392357933s" podCreationTimestamp="2025-10-01 13:20:58 +0000 UTC" 
firstStartedPulling="2025-10-01 13:20:59.472638187 +0000 UTC m=+919.526623086" lastFinishedPulling="2025-10-01 13:21:02.159014059 +0000 UTC m=+922.212998958" observedRunningTime="2025-10-01 13:21:02.390206761 +0000 UTC m=+922.444191720" watchObservedRunningTime="2025-10-01 13:21:02.392357933 +0000 UTC m=+922.446342872" Oct 01 13:21:03 crc kubenswrapper[4749]: I1001 13:21:03.828335 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-brkz8"] Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.230258 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dmmcd"] Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.231507 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.238258 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dmmcd"] Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.376369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptcgh\" (UniqueName: \"kubernetes.io/projected/855affad-2b74-41a1-89c8-d6eba2072bb7-kube-api-access-ptcgh\") pod \"openstack-operator-index-dmmcd\" (UID: \"855affad-2b74-41a1-89c8-d6eba2072bb7\") " pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.388466 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-brkz8" podUID="a1ea70d3-b7ac-4f58-98eb-9c900ab32bff" containerName="registry-server" containerID="cri-o://093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08" gracePeriod=2 Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.478269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ptcgh\" (UniqueName: \"kubernetes.io/projected/855affad-2b74-41a1-89c8-d6eba2072bb7-kube-api-access-ptcgh\") pod \"openstack-operator-index-dmmcd\" (UID: \"855affad-2b74-41a1-89c8-d6eba2072bb7\") " pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.519493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptcgh\" (UniqueName: \"kubernetes.io/projected/855affad-2b74-41a1-89c8-d6eba2072bb7-kube-api-access-ptcgh\") pod \"openstack-operator-index-dmmcd\" (UID: \"855affad-2b74-41a1-89c8-d6eba2072bb7\") " pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.588869 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.827104 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-brkz8" Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.986384 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8m8g\" (UniqueName: \"kubernetes.io/projected/a1ea70d3-b7ac-4f58-98eb-9c900ab32bff-kube-api-access-t8m8g\") pod \"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff\" (UID: \"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff\") " Oct 01 13:21:04 crc kubenswrapper[4749]: I1001 13:21:04.991931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1ea70d3-b7ac-4f58-98eb-9c900ab32bff-kube-api-access-t8m8g" (OuterVolumeSpecName: "kube-api-access-t8m8g") pod "a1ea70d3-b7ac-4f58-98eb-9c900ab32bff" (UID: "a1ea70d3-b7ac-4f58-98eb-9c900ab32bff"). InnerVolumeSpecName "kube-api-access-t8m8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.073627 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dmmcd"] Oct 01 13:21:05 crc kubenswrapper[4749]: W1001 13:21:05.084323 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod855affad_2b74_41a1_89c8_d6eba2072bb7.slice/crio-f640051b8323d88dc4c3f71bb5dab2a5506292efaaa432bccf556ac9f92e11f1 WatchSource:0}: Error finding container f640051b8323d88dc4c3f71bb5dab2a5506292efaaa432bccf556ac9f92e11f1: Status 404 returned error can't find the container with id f640051b8323d88dc4c3f71bb5dab2a5506292efaaa432bccf556ac9f92e11f1 Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.087710 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8m8g\" (UniqueName: \"kubernetes.io/projected/a1ea70d3-b7ac-4f58-98eb-9c900ab32bff-kube-api-access-t8m8g\") on node \"crc\" DevicePath \"\"" Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.406529 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dmmcd" event={"ID":"855affad-2b74-41a1-89c8-d6eba2072bb7","Type":"ContainerStarted","Data":"dab1ecc2229a7529cebdb68599ece578deac94639365e941e24f89d63d243f8b"} Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.406593 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dmmcd" event={"ID":"855affad-2b74-41a1-89c8-d6eba2072bb7","Type":"ContainerStarted","Data":"f640051b8323d88dc4c3f71bb5dab2a5506292efaaa432bccf556ac9f92e11f1"} Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.411204 4749 generic.go:334] "Generic (PLEG): container finished" podID="a1ea70d3-b7ac-4f58-98eb-9c900ab32bff" containerID="093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08" exitCode=0 Oct 01 13:21:05 crc 
kubenswrapper[4749]: I1001 13:21:05.411273 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-brkz8" event={"ID":"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff","Type":"ContainerDied","Data":"093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08"} Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.411301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-brkz8" event={"ID":"a1ea70d3-b7ac-4f58-98eb-9c900ab32bff","Type":"ContainerDied","Data":"c5ce94c718c22dc527afccaae6ca4b0dbcad615f2749070118f5a5175221af60"} Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.411295 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-brkz8" Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.411320 4749 scope.go:117] "RemoveContainer" containerID="093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08" Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.433660 4749 scope.go:117] "RemoveContainer" containerID="093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08" Oct 01 13:21:05 crc kubenswrapper[4749]: E1001 13:21:05.434213 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08\": container with ID starting with 093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08 not found: ID does not exist" containerID="093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08" Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.434390 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08"} err="failed to get container status \"093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08\": rpc 
error: code = NotFound desc = could not find container \"093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08\": container with ID starting with 093ab359faad6a37a489c27358e93d139515eb04847d9135bdc56c4d22bf5e08 not found: ID does not exist" Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.444790 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dmmcd" podStartSLOduration=1.386249077 podStartE2EDuration="1.444763008s" podCreationTimestamp="2025-10-01 13:21:04 +0000 UTC" firstStartedPulling="2025-10-01 13:21:05.093029583 +0000 UTC m=+925.147014512" lastFinishedPulling="2025-10-01 13:21:05.151543504 +0000 UTC m=+925.205528443" observedRunningTime="2025-10-01 13:21:05.43959899 +0000 UTC m=+925.493583889" watchObservedRunningTime="2025-10-01 13:21:05.444763008 +0000 UTC m=+925.498747927" Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.452528 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-brkz8"] Oct 01 13:21:05 crc kubenswrapper[4749]: I1001 13:21:05.461785 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-brkz8"] Oct 01 13:21:07 crc kubenswrapper[4749]: I1001 13:21:07.242959 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1ea70d3-b7ac-4f58-98eb-9c900ab32bff" path="/var/lib/kubelet/pods/a1ea70d3-b7ac-4f58-98eb-9c900ab32bff/volumes" Oct 01 13:21:08 crc kubenswrapper[4749]: I1001 13:21:08.482394 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-vhppt" Oct 01 13:21:14 crc kubenswrapper[4749]: I1001 13:21:14.590338 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:14 crc kubenswrapper[4749]: I1001 13:21:14.590973 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:14 crc kubenswrapper[4749]: I1001 13:21:14.619981 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:15 crc kubenswrapper[4749]: I1001 13:21:15.502719 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-dmmcd" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.087717 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz"] Oct 01 13:21:23 crc kubenswrapper[4749]: E1001 13:21:23.088406 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1ea70d3-b7ac-4f58-98eb-9c900ab32bff" containerName="registry-server" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.088423 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1ea70d3-b7ac-4f58-98eb-9c900ab32bff" containerName="registry-server" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.088540 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1ea70d3-b7ac-4f58-98eb-9c900ab32bff" containerName="registry-server" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.089359 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.092950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vrpp8" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.096711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-util\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.096816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-bundle\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.096860 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75gfc\" (UniqueName: \"kubernetes.io/projected/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-kube-api-access-75gfc\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.098010 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz"] Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 
13:21:23.198318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-util\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.198399 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-bundle\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.198434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75gfc\" (UniqueName: \"kubernetes.io/projected/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-kube-api-access-75gfc\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.198879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-util\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.199201 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-bundle\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.229585 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75gfc\" (UniqueName: \"kubernetes.io/projected/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-kube-api-access-75gfc\") pod \"f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.410006 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:23 crc kubenswrapper[4749]: I1001 13:21:23.688374 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz"] Oct 01 13:21:24 crc kubenswrapper[4749]: I1001 13:21:24.574504 4749 generic.go:334] "Generic (PLEG): container finished" podID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerID="725af94046a42718e6d70ff430bf56b3227c46ce6dbcaa6caf425f7b014da9be" exitCode=0 Oct 01 13:21:24 crc kubenswrapper[4749]: I1001 13:21:24.574581 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" event={"ID":"4ae96a8d-d561-4e2d-a16d-59dec09d2d98","Type":"ContainerDied","Data":"725af94046a42718e6d70ff430bf56b3227c46ce6dbcaa6caf425f7b014da9be"} Oct 01 13:21:24 crc kubenswrapper[4749]: I1001 13:21:24.574647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" event={"ID":"4ae96a8d-d561-4e2d-a16d-59dec09d2d98","Type":"ContainerStarted","Data":"eefe516dba7fbb03a5631599914b8a1cd56b606f296bb3eda6ebf1a07ff0b2c4"} Oct 01 13:21:25 crc kubenswrapper[4749]: I1001 13:21:25.585533 4749 generic.go:334] "Generic (PLEG): container finished" podID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerID="90902c8a8ef2e6cf910d41924a2f9102fcefc09ddd0f698e74f5c0d481b8182d" exitCode=0 Oct 01 13:21:25 crc kubenswrapper[4749]: I1001 13:21:25.585584 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" event={"ID":"4ae96a8d-d561-4e2d-a16d-59dec09d2d98","Type":"ContainerDied","Data":"90902c8a8ef2e6cf910d41924a2f9102fcefc09ddd0f698e74f5c0d481b8182d"} Oct 01 13:21:26 crc kubenswrapper[4749]: I1001 13:21:26.593158 4749 generic.go:334] "Generic (PLEG): container finished" podID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerID="c41c64d68f03f2519ce9ee8d7b60eb025f9efe14c71aed07486abb4bb08ad848" exitCode=0 Oct 01 13:21:26 crc kubenswrapper[4749]: I1001 13:21:26.593201 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" event={"ID":"4ae96a8d-d561-4e2d-a16d-59dec09d2d98","Type":"ContainerDied","Data":"c41c64d68f03f2519ce9ee8d7b60eb025f9efe14c71aed07486abb4bb08ad848"} Oct 01 13:21:27 crc kubenswrapper[4749]: I1001 13:21:27.955077 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.000163 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-util\") pod \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.000317 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-bundle\") pod \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.000401 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75gfc\" (UniqueName: \"kubernetes.io/projected/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-kube-api-access-75gfc\") pod \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\" (UID: \"4ae96a8d-d561-4e2d-a16d-59dec09d2d98\") " Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.001247 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-bundle" (OuterVolumeSpecName: "bundle") pod "4ae96a8d-d561-4e2d-a16d-59dec09d2d98" (UID: "4ae96a8d-d561-4e2d-a16d-59dec09d2d98"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.007493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-kube-api-access-75gfc" (OuterVolumeSpecName: "kube-api-access-75gfc") pod "4ae96a8d-d561-4e2d-a16d-59dec09d2d98" (UID: "4ae96a8d-d561-4e2d-a16d-59dec09d2d98"). InnerVolumeSpecName "kube-api-access-75gfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.034493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-util" (OuterVolumeSpecName: "util") pod "4ae96a8d-d561-4e2d-a16d-59dec09d2d98" (UID: "4ae96a8d-d561-4e2d-a16d-59dec09d2d98"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.101749 4749 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-util\") on node \"crc\" DevicePath \"\"" Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.101782 4749 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.101792 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75gfc\" (UniqueName: \"kubernetes.io/projected/4ae96a8d-d561-4e2d-a16d-59dec09d2d98-kube-api-access-75gfc\") on node \"crc\" DevicePath \"\"" Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.608341 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" event={"ID":"4ae96a8d-d561-4e2d-a16d-59dec09d2d98","Type":"ContainerDied","Data":"eefe516dba7fbb03a5631599914b8a1cd56b606f296bb3eda6ebf1a07ff0b2c4"} Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.608603 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eefe516dba7fbb03a5631599914b8a1cd56b606f296bb3eda6ebf1a07ff0b2c4" Oct 01 13:21:28 crc kubenswrapper[4749]: I1001 13:21:28.608422 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz" Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.106822 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.108513 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.108748 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.110118 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d0168c71ffab7f25d1f8d0338f65483c24fac9eb00fb19dbac8b302415e4b3e"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.110456 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://8d0168c71ffab7f25d1f8d0338f65483c24fac9eb00fb19dbac8b302415e4b3e" gracePeriod=600 Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.648752 4749 generic.go:334] "Generic (PLEG): 
container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="8d0168c71ffab7f25d1f8d0338f65483c24fac9eb00fb19dbac8b302415e4b3e" exitCode=0 Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.648805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"8d0168c71ffab7f25d1f8d0338f65483c24fac9eb00fb19dbac8b302415e4b3e"} Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.648844 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"910cd885f70644be5e766281d9b0c0085bea0ad6a3102c2969a829e2725fb191"} Oct 01 13:21:32 crc kubenswrapper[4749]: I1001 13:21:32.648869 4749 scope.go:117] "RemoveContainer" containerID="bf9f42cda1c41e2e8bbd8a36b3bc094b1e73bf93791da0833fab72799f05a62e" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.698725 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq"] Oct 01 13:21:34 crc kubenswrapper[4749]: E1001 13:21:34.698941 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerName="util" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.698952 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerName="util" Oct 01 13:21:34 crc kubenswrapper[4749]: E1001 13:21:34.698964 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerName="extract" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.698970 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerName="extract" Oct 01 13:21:34 crc 
kubenswrapper[4749]: E1001 13:21:34.698980 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerName="pull" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.698986 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerName="pull" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.699115 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae96a8d-d561-4e2d-a16d-59dec09d2d98" containerName="extract" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.699729 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.705097 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mwg2w" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.793235 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq"] Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.794805 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6wht\" (UniqueName: \"kubernetes.io/projected/b0603207-94a4-47e5-aff1-2572d337f429-kube-api-access-w6wht\") pod \"openstack-operator-controller-operator-7445ccf7db-stmwq\" (UID: \"b0603207-94a4-47e5-aff1-2572d337f429\") " pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.896540 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6wht\" (UniqueName: \"kubernetes.io/projected/b0603207-94a4-47e5-aff1-2572d337f429-kube-api-access-w6wht\") pod 
\"openstack-operator-controller-operator-7445ccf7db-stmwq\" (UID: \"b0603207-94a4-47e5-aff1-2572d337f429\") " pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" Oct 01 13:21:34 crc kubenswrapper[4749]: I1001 13:21:34.916968 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6wht\" (UniqueName: \"kubernetes.io/projected/b0603207-94a4-47e5-aff1-2572d337f429-kube-api-access-w6wht\") pod \"openstack-operator-controller-operator-7445ccf7db-stmwq\" (UID: \"b0603207-94a4-47e5-aff1-2572d337f429\") " pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" Oct 01 13:21:35 crc kubenswrapper[4749]: I1001 13:21:35.054729 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" Oct 01 13:21:35 crc kubenswrapper[4749]: I1001 13:21:35.293047 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq"] Oct 01 13:21:35 crc kubenswrapper[4749]: I1001 13:21:35.677507 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" event={"ID":"b0603207-94a4-47e5-aff1-2572d337f429","Type":"ContainerStarted","Data":"0f852a3971503f2f27b1039e536ce1efe5c829531dac54dc4d4e9e41a88e606b"} Oct 01 13:21:39 crc kubenswrapper[4749]: I1001 13:21:39.790329 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:21:40 crc kubenswrapper[4749]: I1001 13:21:40.710016 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" event={"ID":"b0603207-94a4-47e5-aff1-2572d337f429","Type":"ContainerStarted","Data":"e51f2a1ebaba93f180cd45a20adadefa846f6b66120c9f5e6978ba4c188c706f"} Oct 01 13:21:42 crc kubenswrapper[4749]: I1001 
13:21:42.728082 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" event={"ID":"b0603207-94a4-47e5-aff1-2572d337f429","Type":"ContainerStarted","Data":"a16cc68fcfe7aabbb3bc56fcdb11bdb1b39e7ec47cb5a8a40f0bae10c061b43a"} Oct 01 13:21:42 crc kubenswrapper[4749]: I1001 13:21:42.728809 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" Oct 01 13:21:42 crc kubenswrapper[4749]: I1001 13:21:42.767781 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" podStartSLOduration=1.925896459 podStartE2EDuration="8.767760376s" podCreationTimestamp="2025-10-01 13:21:34 +0000 UTC" firstStartedPulling="2025-10-01 13:21:35.303817674 +0000 UTC m=+955.357802573" lastFinishedPulling="2025-10-01 13:21:42.145681591 +0000 UTC m=+962.199666490" observedRunningTime="2025-10-01 13:21:42.766716866 +0000 UTC m=+962.820701805" watchObservedRunningTime="2025-10-01 13:21:42.767760376 +0000 UTC m=+962.821745295" Oct 01 13:21:45 crc kubenswrapper[4749]: I1001 13:21:45.057852 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7445ccf7db-stmwq" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.379241 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.384567 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.389379 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-q97rs" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.405686 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.434421 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.437581 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.440795 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-7gt49" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.457277 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.458264 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.460153 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xvnf7" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.464587 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.473274 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.474430 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.481246 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-njqr2" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.481386 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.490233 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.509336 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.510318 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.514357 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-mtq7z" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.520030 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.521315 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.525066 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-6zp52" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.535021 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j944\" (UniqueName: \"kubernetes.io/projected/4a7c1ef4-c125-445b-9f1e-b24ee27e2938-kube-api-access-7j944\") pod \"cinder-operator-controller-manager-644bddb6d8-58njn\" (UID: \"4a7c1ef4-c125-445b-9f1e-b24ee27e2938\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.535057 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fzkm\" (UniqueName: \"kubernetes.io/projected/0418f548-554a-4efc-8494-4edc9d56fc7f-kube-api-access-2fzkm\") pod \"designate-operator-controller-manager-84f4f7b77b-ck4dv\" (UID: \"0418f548-554a-4efc-8494-4edc9d56fc7f\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.535230 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttzrk\" (UniqueName: \"kubernetes.io/projected/99a6dbcc-0b05-4471-b2d4-acacf72f6ff0-kube-api-access-ttzrk\") pod \"barbican-operator-controller-manager-6ff8b75857-vvxcs\" (UID: \"99a6dbcc-0b05-4471-b2d4-acacf72f6ff0\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.548407 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.573352 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.574819 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.577506 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8jdcf" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.577729 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.582035 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.592146 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.596192 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.599329 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-4kxhw" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.611446 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.635310 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.636600 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.646130 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j944\" (UniqueName: \"kubernetes.io/projected/4a7c1ef4-c125-445b-9f1e-b24ee27e2938-kube-api-access-7j944\") pod \"cinder-operator-controller-manager-644bddb6d8-58njn\" (UID: \"4a7c1ef4-c125-445b-9f1e-b24ee27e2938\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.646184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fzkm\" (UniqueName: \"kubernetes.io/projected/0418f548-554a-4efc-8494-4edc9d56fc7f-kube-api-access-2fzkm\") pod \"designate-operator-controller-manager-84f4f7b77b-ck4dv\" (UID: \"0418f548-554a-4efc-8494-4edc9d56fc7f\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.646318 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ttzrk\" (UniqueName: \"kubernetes.io/projected/99a6dbcc-0b05-4471-b2d4-acacf72f6ff0-kube-api-access-ttzrk\") pod \"barbican-operator-controller-manager-6ff8b75857-vvxcs\" (UID: \"99a6dbcc-0b05-4471-b2d4-acacf72f6ff0\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.646337 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nt5m\" (UniqueName: \"kubernetes.io/projected/6647f74c-8bbb-490d-ade7-8b2fb5469ddc-kube-api-access-4nt5m\") pod \"glance-operator-controller-manager-84958c4d49-6w5m5\" (UID: \"6647f74c-8bbb-490d-ade7-8b2fb5469ddc\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.646363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrhz\" (UniqueName: \"kubernetes.io/projected/11409643-1cee-49c5-b3d0-fa1ec4cb1af0-kube-api-access-9vrhz\") pod \"heat-operator-controller-manager-5d889d78cf-h22cx\" (UID: \"11409643-1cee-49c5-b3d0-fa1ec4cb1af0\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.646382 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfrfk\" (UniqueName: \"kubernetes.io/projected/02d25e12-ca12-40f3-bc21-2b5a55fdba5d-kube-api-access-jfrfk\") pod \"horizon-operator-controller-manager-9f4696d94-4gdqf\" (UID: \"02d25e12-ca12-40f3-bc21-2b5a55fdba5d\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.647824 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 
13:22:15.648593 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ccdvk" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.658174 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.669333 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.670119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fzkm\" (UniqueName: \"kubernetes.io/projected/0418f548-554a-4efc-8494-4edc9d56fc7f-kube-api-access-2fzkm\") pod \"designate-operator-controller-manager-84f4f7b77b-ck4dv\" (UID: \"0418f548-554a-4efc-8494-4edc9d56fc7f\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.670369 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.675148 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttzrk\" (UniqueName: \"kubernetes.io/projected/99a6dbcc-0b05-4471-b2d4-acacf72f6ff0-kube-api-access-ttzrk\") pod \"barbican-operator-controller-manager-6ff8b75857-vvxcs\" (UID: \"99a6dbcc-0b05-4471-b2d4-acacf72f6ff0\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.678498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j944\" (UniqueName: \"kubernetes.io/projected/4a7c1ef4-c125-445b-9f1e-b24ee27e2938-kube-api-access-7j944\") pod \"cinder-operator-controller-manager-644bddb6d8-58njn\" (UID: \"4a7c1ef4-c125-445b-9f1e-b24ee27e2938\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.678619 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-wzn29" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.698286 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.711349 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-8898x"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.712663 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.715513 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-nwksj" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.731683 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-8898x"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.738344 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.739369 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.741030 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-24t4n" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.750293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f20ab81-68d3-4973-9336-d00440b811f9-cert\") pod \"infra-operator-controller-manager-9d6c5db85-xk2zl\" (UID: \"8f20ab81-68d3-4973-9336-d00440b811f9\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.750329 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmfp\" (UniqueName: \"kubernetes.io/projected/8f20ab81-68d3-4973-9336-d00440b811f9-kube-api-access-zcmfp\") pod \"infra-operator-controller-manager-9d6c5db85-xk2zl\" (UID: \"8f20ab81-68d3-4973-9336-d00440b811f9\") " 
pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.750384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nt5m\" (UniqueName: \"kubernetes.io/projected/6647f74c-8bbb-490d-ade7-8b2fb5469ddc-kube-api-access-4nt5m\") pod \"glance-operator-controller-manager-84958c4d49-6w5m5\" (UID: \"6647f74c-8bbb-490d-ade7-8b2fb5469ddc\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.750402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrhz\" (UniqueName: \"kubernetes.io/projected/11409643-1cee-49c5-b3d0-fa1ec4cb1af0-kube-api-access-9vrhz\") pod \"heat-operator-controller-manager-5d889d78cf-h22cx\" (UID: \"11409643-1cee-49c5-b3d0-fa1ec4cb1af0\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.750417 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfrfk\" (UniqueName: \"kubernetes.io/projected/02d25e12-ca12-40f3-bc21-2b5a55fdba5d-kube-api-access-jfrfk\") pod \"horizon-operator-controller-manager-9f4696d94-4gdqf\" (UID: \"02d25e12-ca12-40f3-bc21-2b5a55fdba5d\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.750474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hb47\" (UniqueName: \"kubernetes.io/projected/00534b7e-41c7-4935-8349-78aee327867e-kube-api-access-4hb47\") pod \"keystone-operator-controller-manager-5bd55b4bff-292hc\" (UID: \"00534b7e-41c7-4935-8349-78aee327867e\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.750507 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqfsq\" (UniqueName: \"kubernetes.io/projected/40a29698-620f-45d7-b630-0cfe188dd09f-kube-api-access-dqfsq\") pod \"ironic-operator-controller-manager-5cd4858477-bhn8n\" (UID: \"40a29698-620f-45d7-b630-0cfe188dd09f\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.751421 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.756531 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.757426 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.757614 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.761732 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gjhdk" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.763976 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.767046 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.772281 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.772940 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.775666 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.775778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfrfk\" (UniqueName: \"kubernetes.io/projected/02d25e12-ca12-40f3-bc21-2b5a55fdba5d-kube-api-access-jfrfk\") pod \"horizon-operator-controller-manager-9f4696d94-4gdqf\" (UID: \"02d25e12-ca12-40f3-bc21-2b5a55fdba5d\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.776208 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.801812 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nt5m\" (UniqueName: \"kubernetes.io/projected/6647f74c-8bbb-490d-ade7-8b2fb5469ddc-kube-api-access-4nt5m\") pod \"glance-operator-controller-manager-84958c4d49-6w5m5\" (UID: \"6647f74c-8bbb-490d-ade7-8b2fb5469ddc\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.802515 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrhz\" (UniqueName: \"kubernetes.io/projected/11409643-1cee-49c5-b3d0-fa1ec4cb1af0-kube-api-access-9vrhz\") pod \"heat-operator-controller-manager-5d889d78cf-h22cx\" (UID: \"11409643-1cee-49c5-b3d0-fa1ec4cb1af0\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.802685 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ztvqj" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.803004 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.803087 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-j6hpz" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.803272 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.803631 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.810820 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.819181 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.830480 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.833080 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-sw882" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.841889 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.845334 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.850674 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.856505 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.856768 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-t5wt7" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.857464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5grm\" (UniqueName: \"kubernetes.io/projected/bf6d3a96-3f74-44df-8e75-1865612d0303-kube-api-access-g5grm\") pod \"nova-operator-controller-manager-64cd67b5cb-jqvd5\" (UID: \"bf6d3a96-3f74-44df-8e75-1865612d0303\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.857497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hb47\" (UniqueName: \"kubernetes.io/projected/00534b7e-41c7-4935-8349-78aee327867e-kube-api-access-4hb47\") pod \"keystone-operator-controller-manager-5bd55b4bff-292hc\" (UID: \"00534b7e-41c7-4935-8349-78aee327867e\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.857523 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqfsq\" (UniqueName: \"kubernetes.io/projected/40a29698-620f-45d7-b630-0cfe188dd09f-kube-api-access-dqfsq\") pod \"ironic-operator-controller-manager-5cd4858477-bhn8n\" (UID: \"40a29698-620f-45d7-b630-0cfe188dd09f\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.857543 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f20ab81-68d3-4973-9336-d00440b811f9-cert\") pod \"infra-operator-controller-manager-9d6c5db85-xk2zl\" 
(UID: \"8f20ab81-68d3-4973-9336-d00440b811f9\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.857558 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmfp\" (UniqueName: \"kubernetes.io/projected/8f20ab81-68d3-4973-9336-d00440b811f9-kube-api-access-zcmfp\") pod \"infra-operator-controller-manager-9d6c5db85-xk2zl\" (UID: \"8f20ab81-68d3-4973-9336-d00440b811f9\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.857595 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knjfd\" (UniqueName: \"kubernetes.io/projected/a26beb61-7189-40d0-9284-e58654887bbd-kube-api-access-knjfd\") pod \"neutron-operator-controller-manager-849d5b9b84-j4s7v\" (UID: \"a26beb61-7189-40d0-9284-e58654887bbd\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.857631 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj6vh\" (UniqueName: \"kubernetes.io/projected/3bb59228-30b8-42af-b24c-dc50224fde04-kube-api-access-nj6vh\") pod \"mariadb-operator-controller-manager-88c7-8898x\" (UID: \"3bb59228-30b8-42af-b24c-dc50224fde04\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.857651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rrn\" (UniqueName: \"kubernetes.io/projected/6a63cd00-64f1-42cf-8250-abc3dfc3a4ff-kube-api-access-x4rrn\") pod \"manila-operator-controller-manager-6d68dbc695-8z228\" (UID: \"6a63cd00-64f1-42cf-8250-abc3dfc3a4ff\") " 
pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.863242 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f20ab81-68d3-4973-9336-d00440b811f9-cert\") pod \"infra-operator-controller-manager-9d6c5db85-xk2zl\" (UID: \"8f20ab81-68d3-4973-9336-d00440b811f9\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.871041 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.881772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hb47\" (UniqueName: \"kubernetes.io/projected/00534b7e-41c7-4935-8349-78aee327867e-kube-api-access-4hb47\") pod \"keystone-operator-controller-manager-5bd55b4bff-292hc\" (UID: \"00534b7e-41c7-4935-8349-78aee327867e\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.881876 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.882457 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqfsq\" (UniqueName: \"kubernetes.io/projected/40a29698-620f-45d7-b630-0cfe188dd09f-kube-api-access-dqfsq\") pod \"ironic-operator-controller-manager-5cd4858477-bhn8n\" (UID: \"40a29698-620f-45d7-b630-0cfe188dd09f\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.898690 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp"] Oct 01 
13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.899631 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.904279 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-cfnfl" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.911992 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmfp\" (UniqueName: \"kubernetes.io/projected/8f20ab81-68d3-4973-9336-d00440b811f9-kube-api-access-zcmfp\") pod \"infra-operator-controller-manager-9d6c5db85-xk2zl\" (UID: \"8f20ab81-68d3-4973-9336-d00440b811f9\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.922964 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.929853 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.949875 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958531 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-965jh\" (UniqueName: \"kubernetes.io/projected/b544aac2-b3f7-453e-a05b-58f22b1b4fe1-kube-api-access-965jh\") pod \"placement-operator-controller-manager-589c58c6c-tzpfm\" (UID: \"b544aac2-b3f7-453e-a05b-58f22b1b4fe1\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvsz\" (UniqueName: \"kubernetes.io/projected/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-kube-api-access-wkvsz\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cg66wx\" (UID: \"557ca73b-a87b-4d42-8d86-dfbd057ae1fd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knjfd\" (UniqueName: \"kubernetes.io/projected/a26beb61-7189-40d0-9284-e58654887bbd-kube-api-access-knjfd\") pod \"neutron-operator-controller-manager-849d5b9b84-j4s7v\" (UID: \"a26beb61-7189-40d0-9284-e58654887bbd\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958674 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpspj\" (UniqueName: \"kubernetes.io/projected/ab9def29-b23e-4af8-828b-3c4151503a96-kube-api-access-rpspj\") pod \"octavia-operator-controller-manager-7b787867f4-cr9ww\" (UID: \"ab9def29-b23e-4af8-828b-3c4151503a96\") " 
pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cg66wx\" (UID: \"557ca73b-a87b-4d42-8d86-dfbd057ae1fd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwtc\" (UniqueName: \"kubernetes.io/projected/f7c97980-24c5-42e5-b60c-763bd31ad269-kube-api-access-xxwtc\") pod \"ovn-operator-controller-manager-9976ff44c-cwjsk\" (UID: \"f7c97980-24c5-42e5-b60c-763bd31ad269\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958747 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj6vh\" (UniqueName: \"kubernetes.io/projected/3bb59228-30b8-42af-b24c-dc50224fde04-kube-api-access-nj6vh\") pod \"mariadb-operator-controller-manager-88c7-8898x\" (UID: \"3bb59228-30b8-42af-b24c-dc50224fde04\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958765 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rrn\" (UniqueName: \"kubernetes.io/projected/6a63cd00-64f1-42cf-8250-abc3dfc3a4ff-kube-api-access-x4rrn\") pod \"manila-operator-controller-manager-6d68dbc695-8z228\" (UID: \"6a63cd00-64f1-42cf-8250-abc3dfc3a4ff\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.958794 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5grm\" (UniqueName: \"kubernetes.io/projected/bf6d3a96-3f74-44df-8e75-1865612d0303-kube-api-access-g5grm\") pod \"nova-operator-controller-manager-64cd67b5cb-jqvd5\" (UID: \"bf6d3a96-3f74-44df-8e75-1865612d0303\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.964803 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.966021 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.973523 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.974576 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz"] Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.980394 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-w6h8g" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.985824 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rrn\" (UniqueName: \"kubernetes.io/projected/6a63cd00-64f1-42cf-8250-abc3dfc3a4ff-kube-api-access-x4rrn\") pod \"manila-operator-controller-manager-6d68dbc695-8z228\" (UID: \"6a63cd00-64f1-42cf-8250-abc3dfc3a4ff\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.988320 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-knjfd\" (UniqueName: \"kubernetes.io/projected/a26beb61-7189-40d0-9284-e58654887bbd-kube-api-access-knjfd\") pod \"neutron-operator-controller-manager-849d5b9b84-j4s7v\" (UID: \"a26beb61-7189-40d0-9284-e58654887bbd\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.988403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5grm\" (UniqueName: \"kubernetes.io/projected/bf6d3a96-3f74-44df-8e75-1865612d0303-kube-api-access-g5grm\") pod \"nova-operator-controller-manager-64cd67b5cb-jqvd5\" (UID: \"bf6d3a96-3f74-44df-8e75-1865612d0303\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.996679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj6vh\" (UniqueName: \"kubernetes.io/projected/3bb59228-30b8-42af-b24c-dc50224fde04-kube-api-access-nj6vh\") pod \"mariadb-operator-controller-manager-88c7-8898x\" (UID: \"3bb59228-30b8-42af-b24c-dc50224fde04\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" Oct 01 13:22:15 crc kubenswrapper[4749]: I1001 13:22:15.999179 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-bdz5p"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.000381 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.003296 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-tdg6p" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.008874 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-bdz5p"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.038105 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.039585 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.041181 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.044259 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-z7xqd" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.060620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnn72\" (UniqueName: \"kubernetes.io/projected/c81045dc-62f0-4ae6-9e05-e26cc8f90611-kube-api-access-pnn72\") pod \"swift-operator-controller-manager-84d6b4b759-5qtjp\" (UID: \"c81045dc-62f0-4ae6-9e05-e26cc8f90611\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.060957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cg66wx\" (UID: \"557ca73b-a87b-4d42-8d86-dfbd057ae1fd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.061161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwtc\" (UniqueName: \"kubernetes.io/projected/f7c97980-24c5-42e5-b60c-763bd31ad269-kube-api-access-xxwtc\") pod \"ovn-operator-controller-manager-9976ff44c-cwjsk\" (UID: \"f7c97980-24c5-42e5-b60c-763bd31ad269\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.061400 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jz2x\" (UniqueName: \"kubernetes.io/projected/765c8918-abbc-47dd-8960-18292d54a9a0-kube-api-access-6jz2x\") pod 
\"telemetry-operator-controller-manager-b8d54b5d7-9s6hz\" (UID: \"765c8918-abbc-47dd-8960-18292d54a9a0\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.061648 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-965jh\" (UniqueName: \"kubernetes.io/projected/b544aac2-b3f7-453e-a05b-58f22b1b4fe1-kube-api-access-965jh\") pod \"placement-operator-controller-manager-589c58c6c-tzpfm\" (UID: \"b544aac2-b3f7-453e-a05b-58f22b1b4fe1\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.061791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvsz\" (UniqueName: \"kubernetes.io/projected/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-kube-api-access-wkvsz\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cg66wx\" (UID: \"557ca73b-a87b-4d42-8d86-dfbd057ae1fd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.061974 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpspj\" (UniqueName: \"kubernetes.io/projected/ab9def29-b23e-4af8-828b-3c4151503a96-kube-api-access-rpspj\") pod \"octavia-operator-controller-manager-7b787867f4-cr9ww\" (UID: \"ab9def29-b23e-4af8-828b-3c4151503a96\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" Oct 01 13:22:16 crc kubenswrapper[4749]: E1001 13:22:16.062236 4749 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 13:22:16 crc kubenswrapper[4749]: E1001 13:22:16.062599 4749 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-cert podName:557ca73b-a87b-4d42-8d86-dfbd057ae1fd nodeName:}" failed. No retries permitted until 2025-10-01 13:22:16.562546308 +0000 UTC m=+996.616531317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" (UID: "557ca73b-a87b-4d42-8d86-dfbd057ae1fd") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.087436 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.090193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvsz\" (UniqueName: \"kubernetes.io/projected/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-kube-api-access-wkvsz\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cg66wx\" (UID: \"557ca73b-a87b-4d42-8d86-dfbd057ae1fd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.091064 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.094574 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.095049 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-kx24s" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.097162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpspj\" (UniqueName: \"kubernetes.io/projected/ab9def29-b23e-4af8-828b-3c4151503a96-kube-api-access-rpspj\") pod \"octavia-operator-controller-manager-7b787867f4-cr9ww\" (UID: \"ab9def29-b23e-4af8-828b-3c4151503a96\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.097192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-965jh\" (UniqueName: \"kubernetes.io/projected/b544aac2-b3f7-453e-a05b-58f22b1b4fe1-kube-api-access-965jh\") pod \"placement-operator-controller-manager-589c58c6c-tzpfm\" (UID: \"b544aac2-b3f7-453e-a05b-58f22b1b4fe1\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.106681 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.110954 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwtc\" (UniqueName: \"kubernetes.io/projected/f7c97980-24c5-42e5-b60c-763bd31ad269-kube-api-access-xxwtc\") pod \"ovn-operator-controller-manager-9976ff44c-cwjsk\" (UID: \"f7c97980-24c5-42e5-b60c-763bd31ad269\") " 
pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.141990 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.145590 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.156517 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.158689 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.165639 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jz2x\" (UniqueName: \"kubernetes.io/projected/765c8918-abbc-47dd-8960-18292d54a9a0-kube-api-access-6jz2x\") pod \"telemetry-operator-controller-manager-b8d54b5d7-9s6hz\" (UID: \"765c8918-abbc-47dd-8960-18292d54a9a0\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.165679 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86nxh\" (UniqueName: \"kubernetes.io/projected/1d38e134-a674-403f-9a4b-4ee8de1fe763-kube-api-access-86nxh\") pod \"watcher-operator-controller-manager-56f5865b8b-4g6n9\" (UID: \"1d38e134-a674-403f-9a4b-4ee8de1fe763\") " pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.165795 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqkd\" (UniqueName: \"kubernetes.io/projected/acf0c68b-d009-4f2f-a21e-a2573842a063-kube-api-access-2qqkd\") pod \"test-operator-controller-manager-85777745bb-bdz5p\" (UID: \"acf0c68b-d009-4f2f-a21e-a2573842a063\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.165827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnn72\" (UniqueName: \"kubernetes.io/projected/c81045dc-62f0-4ae6-9e05-e26cc8f90611-kube-api-access-pnn72\") pod \"swift-operator-controller-manager-84d6b4b759-5qtjp\" (UID: \"c81045dc-62f0-4ae6-9e05-e26cc8f90611\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.188153 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnn72\" (UniqueName: \"kubernetes.io/projected/c81045dc-62f0-4ae6-9e05-e26cc8f90611-kube-api-access-pnn72\") pod \"swift-operator-controller-manager-84d6b4b759-5qtjp\" (UID: \"c81045dc-62f0-4ae6-9e05-e26cc8f90611\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.189183 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.191396 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.196199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jz2x\" (UniqueName: \"kubernetes.io/projected/765c8918-abbc-47dd-8960-18292d54a9a0-kube-api-access-6jz2x\") pod \"telemetry-operator-controller-manager-b8d54b5d7-9s6hz\" (UID: \"765c8918-abbc-47dd-8960-18292d54a9a0\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.212787 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.216314 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.218488 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.220035 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.220991 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.226206 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-zv8z2" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.226934 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.247408 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.271423 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86nxh\" (UniqueName: \"kubernetes.io/projected/1d38e134-a674-403f-9a4b-4ee8de1fe763-kube-api-access-86nxh\") pod \"watcher-operator-controller-manager-56f5865b8b-4g6n9\" (UID: \"1d38e134-a674-403f-9a4b-4ee8de1fe763\") " pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.271492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert\") pod \"openstack-operator-controller-manager-5fc59ccd99-c2sc8\" (UID: \"9aac26fb-f511-4491-a239-7c5f7ced5f43\") " pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.271594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qqkd\" (UniqueName: \"kubernetes.io/projected/acf0c68b-d009-4f2f-a21e-a2573842a063-kube-api-access-2qqkd\") pod \"test-operator-controller-manager-85777745bb-bdz5p\" (UID: \"acf0c68b-d009-4f2f-a21e-a2573842a063\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.271629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v94p\" (UniqueName: \"kubernetes.io/projected/9aac26fb-f511-4491-a239-7c5f7ced5f43-kube-api-access-6v94p\") pod 
\"openstack-operator-controller-manager-5fc59ccd99-c2sc8\" (UID: \"9aac26fb-f511-4491-a239-7c5f7ced5f43\") " pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.302657 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86nxh\" (UniqueName: \"kubernetes.io/projected/1d38e134-a674-403f-9a4b-4ee8de1fe763-kube-api-access-86nxh\") pod \"watcher-operator-controller-manager-56f5865b8b-4g6n9\" (UID: \"1d38e134-a674-403f-9a4b-4ee8de1fe763\") " pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.312355 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qqkd\" (UniqueName: \"kubernetes.io/projected/acf0c68b-d009-4f2f-a21e-a2573842a063-kube-api-access-2qqkd\") pod \"test-operator-controller-manager-85777745bb-bdz5p\" (UID: \"acf0c68b-d009-4f2f-a21e-a2573842a063\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.372934 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v94p\" (UniqueName: \"kubernetes.io/projected/9aac26fb-f511-4491-a239-7c5f7ced5f43-kube-api-access-6v94p\") pod \"openstack-operator-controller-manager-5fc59ccd99-c2sc8\" (UID: \"9aac26fb-f511-4491-a239-7c5f7ced5f43\") " pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.372974 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66bp\" (UniqueName: \"kubernetes.io/projected/33e04d57-8c80-4ed0-b05b-1edf290d476e-kube-api-access-b66bp\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-x26nl\" (UID: \"33e04d57-8c80-4ed0-b05b-1edf290d476e\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.373052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert\") pod \"openstack-operator-controller-manager-5fc59ccd99-c2sc8\" (UID: \"9aac26fb-f511-4491-a239-7c5f7ced5f43\") " pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:16 crc kubenswrapper[4749]: E1001 13:22:16.373162 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 13:22:16 crc kubenswrapper[4749]: E1001 13:22:16.373205 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert podName:9aac26fb-f511-4491-a239-7c5f7ced5f43 nodeName:}" failed. No retries permitted until 2025-10-01 13:22:16.873191929 +0000 UTC m=+996.927176828 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert") pod "openstack-operator-controller-manager-5fc59ccd99-c2sc8" (UID: "9aac26fb-f511-4491-a239-7c5f7ced5f43") : secret "webhook-server-cert" not found Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.397537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v94p\" (UniqueName: \"kubernetes.io/projected/9aac26fb-f511-4491-a239-7c5f7ced5f43-kube-api-access-6v94p\") pod \"openstack-operator-controller-manager-5fc59ccd99-c2sc8\" (UID: \"9aac26fb-f511-4491-a239-7c5f7ced5f43\") " pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.475753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66bp\" (UniqueName: \"kubernetes.io/projected/33e04d57-8c80-4ed0-b05b-1edf290d476e-kube-api-access-b66bp\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-x26nl\" (UID: \"33e04d57-8c80-4ed0-b05b-1edf290d476e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.500866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66bp\" (UniqueName: \"kubernetes.io/projected/33e04d57-8c80-4ed0-b05b-1edf290d476e-kube-api-access-b66bp\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-x26nl\" (UID: \"33e04d57-8c80-4ed0-b05b-1edf290d476e\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.533363 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.551800 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.555380 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.562362 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.577473 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cg66wx\" (UID: \"557ca73b-a87b-4d42-8d86-dfbd057ae1fd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.584288 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/557ca73b-a87b-4d42-8d86-dfbd057ae1fd-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8cg66wx\" (UID: \"557ca73b-a87b-4d42-8d86-dfbd057ae1fd\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.589999 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.639923 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.815019 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.823676 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv"] Oct 01 13:22:16 crc kubenswrapper[4749]: I1001 13:22:16.881106 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert\") pod \"openstack-operator-controller-manager-5fc59ccd99-c2sc8\" (UID: \"9aac26fb-f511-4491-a239-7c5f7ced5f43\") " pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:16 crc kubenswrapper[4749]: E1001 13:22:16.881244 4749 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 13:22:16 crc kubenswrapper[4749]: E1001 13:22:16.881291 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert podName:9aac26fb-f511-4491-a239-7c5f7ced5f43 nodeName:}" failed. No retries permitted until 2025-10-01 13:22:17.881277728 +0000 UTC m=+997.935262627 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert") pod "openstack-operator-controller-manager-5fc59ccd99-c2sc8" (UID: "9aac26fb-f511-4491-a239-7c5f7ced5f43") : secret "webhook-server-cert" not found Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.005746 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" event={"ID":"4a7c1ef4-c125-445b-9f1e-b24ee27e2938","Type":"ContainerStarted","Data":"ed9544af95b00154f3915d9d6f87fb6187924efd534df11fe8f5b2ce833c93a5"} Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.008164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" event={"ID":"6647f74c-8bbb-490d-ade7-8b2fb5469ddc","Type":"ContainerStarted","Data":"b58e8edb6d32a8d6211529aa1ffed6ddee43ca6e0e986e149b7d04383e911d34"} Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.009823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" event={"ID":"99a6dbcc-0b05-4471-b2d4-acacf72f6ff0","Type":"ContainerStarted","Data":"dbce9e8a0b346a0678f92a047873af214996c268b4ec68f5610a80641c231fa3"} Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.011873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" event={"ID":"0418f548-554a-4efc-8494-4edc9d56fc7f","Type":"ContainerStarted","Data":"5c436e24e635f7e0ae28e2b85653f0a4f8cf3d4d6b035f9526399375c0ad262b"} Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.178146 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc"] Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.207517 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf"] Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.218495 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n"] Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.235597 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx"] Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.250414 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40a29698_620f_45d7_b630_0cfe188dd09f.slice/crio-1d931f3ad7d25fd442696f47931946251061b36f7452ad66a78ead450deec0c4 WatchSource:0}: Error finding container 1d931f3ad7d25fd442696f47931946251061b36f7452ad66a78ead450deec0c4: Status 404 returned error can't find the container with id 1d931f3ad7d25fd442696f47931946251061b36f7452ad66a78ead450deec0c4 Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.262519 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228"] Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.262554 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v"] Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.265937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm"] Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.278502 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5"] Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.279005 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26beb61_7189_40d0_9284_e58654887bbd.slice/crio-f6de18a6a8a7c9ee3e259ca24153d7de51770c5686b3fbde8b10f593ead91dfc WatchSource:0}: Error finding container f6de18a6a8a7c9ee3e259ca24153d7de51770c5686b3fbde8b10f593ead91dfc: Status 404 returned error can't find the container with id f6de18a6a8a7c9ee3e259ca24153d7de51770c5686b3fbde8b10f593ead91dfc Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.283191 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc81045dc_62f0_4ae6_9e05_e26cc8f90611.slice/crio-12bf43e34e937b0e026e77299bc73026ddef9e5d29cbc7dbde420bc5a7c8ac53 WatchSource:0}: Error finding container 12bf43e34e937b0e026e77299bc73026ddef9e5d29cbc7dbde420bc5a7c8ac53: Status 404 returned error can't find the container with id 12bf43e34e937b0e026e77299bc73026ddef9e5d29cbc7dbde420bc5a7c8ac53 Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.291039 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-8898x"] Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.292106 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c97980_24c5_42e5_b60c_763bd31ad269.slice/crio-d40f57d1bc92d45a6457440dd1e84d682dd6e85894bca240721e1bf685a05aa8 WatchSource:0}: Error finding container d40f57d1bc92d45a6457440dd1e84d682dd6e85894bca240721e1bf685a05aa8: Status 404 returned error can't find the container with id d40f57d1bc92d45a6457440dd1e84d682dd6e85894bca240721e1bf685a05aa8 Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.298317 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp"] Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.298519 4749 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb59228_30b8_42af_b24c_dc50224fde04.slice/crio-abb6152f10661fa90113c6b888676c237d9e7d9dcc166757b94460f75b82dcbd WatchSource:0}: Error finding container abb6152f10661fa90113c6b888676c237d9e7d9dcc166757b94460f75b82dcbd: Status 404 returned error can't find the container with id abb6152f10661fa90113c6b888676c237d9e7d9dcc166757b94460f75b82dcbd Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.301080 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nj6vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-88c7-8898x_openstack-operators(3bb59228-30b8-42af-b24c-dc50224fde04): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.302604 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk"] Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.305301 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf6d3a96_3f74_44df_8e75_1865612d0303.slice/crio-98f702b4d34e5307543baf2496def0774eae63842939219aa3f84330ca06b459 WatchSource:0}: Error finding container 98f702b4d34e5307543baf2496def0774eae63842939219aa3f84330ca06b459: Status 404 returned error can't find the container with id 98f702b4d34e5307543baf2496def0774eae63842939219aa3f84330ca06b459 Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.318524 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g5grm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-64cd67b5cb-jqvd5_openstack-operators(bf6d3a96-3f74-44df-8e75-1865612d0303): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.410041 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9"] Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.413286 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-bdz5p"] Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.416580 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww"] Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.429531 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab9def29_b23e_4af8_828b_3c4151503a96.slice/crio-5d78825802f0671c1b41814081db100d1e075e15a4443c3437f32f76b5c90707 WatchSource:0}: Error finding container 5d78825802f0671c1b41814081db100d1e075e15a4443c3437f32f76b5c90707: Status 404 returned 
error can't find the container with id 5d78825802f0671c1b41814081db100d1e075e15a4443c3437f32f76b5c90707 Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.438012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl"] Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.439933 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rpspj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7b787867f4-cr9ww_openstack-operators(ab9def29-b23e-4af8-828b-3c4151503a96): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.447251 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz"] Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.453875 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2qqkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-bdz5p_openstack-operators(acf0c68b-d009-4f2f-a21e-a2573842a063): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.460473 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765c8918_abbc_47dd_8960_18292d54a9a0.slice/crio-08d9c8daec10f7848a84a46c92fd3a25ae96a367068b205c2cdb5e466bfcf2ae WatchSource:0}: 
Error finding container 08d9c8daec10f7848a84a46c92fd3a25ae96a367068b205c2cdb5e466bfcf2ae: Status 404 returned error can't find the container with id 08d9c8daec10f7848a84a46c92fd3a25ae96a367068b205c2cdb5e466bfcf2ae Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.464074 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6jz2x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-9s6hz_openstack-operators(765c8918-abbc-47dd-8960-18292d54a9a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.464706 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e04d57_8c80_4ed0_b05b_1edf290d476e.slice/crio-59f43b4473ca52d85d032b578adc21c480e9cd85cf0295f4b7c8a95afc43775a WatchSource:0}: Error finding container 59f43b4473ca52d85d032b578adc21c480e9cd85cf0295f4b7c8a95afc43775a: Status 404 returned error can't find the container with id 59f43b4473ca52d85d032b578adc21c480e9cd85cf0295f4b7c8a95afc43775a Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.468051 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl"] Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.469561 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" 
podUID="3bb59228-30b8-42af-b24c-dc50224fde04" Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.478208 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b66bp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod rabbitmq-cluster-operator-manager-5f97d8c699-x26nl_openstack-operators(33e04d57-8c80-4ed0-b05b-1edf290d476e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.480558 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" podUID="33e04d57-8c80-4ed0-b05b-1edf290d476e" Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.482949 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f20ab81_68d3_4973_9336_d00440b811f9.slice/crio-79952a0a8421ee1b0b2b001955bb7f36b91caf54d15cf0d36c3e2ca6382e1470 WatchSource:0}: Error finding container 79952a0a8421ee1b0b2b001955bb7f36b91caf54d15cf0d36c3e2ca6382e1470: Status 404 returned error can't find the container with id 79952a0a8421ee1b0b2b001955bb7f36b91caf54d15cf0d36c3e2ca6382e1470 Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.493186 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zcmfp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-9d6c5db85-xk2zl_openstack-operators(8f20ab81-68d3-4973-9336-d00440b811f9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.497978 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" podUID="bf6d3a96-3f74-44df-8e75-1865612d0303" Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.504007 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx"] Oct 01 13:22:17 crc kubenswrapper[4749]: W1001 13:22:17.509664 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557ca73b_a87b_4d42_8d86_dfbd057ae1fd.slice/crio-089b220d19b2282be516498cbadc1781e2c88473c48366ac09201ec8cd922aff WatchSource:0}: Error finding container 089b220d19b2282be516498cbadc1781e2c88473c48366ac09201ec8cd922aff: Status 404 returned error can't find the container with id 089b220d19b2282be516498cbadc1781e2c88473c48366ac09201ec8cd922aff Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.518946 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECI
SION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wkvsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-77b9676b8cg66wx_openstack-operators(557ca73b-a87b-4d42-8d86-dfbd057ae1fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.628499 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" podUID="acf0c68b-d009-4f2f-a21e-a2573842a063" Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.632737 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" podUID="765c8918-abbc-47dd-8960-18292d54a9a0" Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.635884 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" podUID="ab9def29-b23e-4af8-828b-3c4151503a96" Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.659660 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" podUID="8f20ab81-68d3-4973-9336-d00440b811f9" Oct 01 13:22:17 crc kubenswrapper[4749]: E1001 13:22:17.755555 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" podUID="557ca73b-a87b-4d42-8d86-dfbd057ae1fd" Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.896420 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert\") pod \"openstack-operator-controller-manager-5fc59ccd99-c2sc8\" (UID: \"9aac26fb-f511-4491-a239-7c5f7ced5f43\") " pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:17 crc kubenswrapper[4749]: I1001 13:22:17.901487 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9aac26fb-f511-4491-a239-7c5f7ced5f43-cert\") pod \"openstack-operator-controller-manager-5fc59ccd99-c2sc8\" (UID: \"9aac26fb-f511-4491-a239-7c5f7ced5f43\") " pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.020945 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" event={"ID":"8f20ab81-68d3-4973-9336-d00440b811f9","Type":"ContainerStarted","Data":"a57a6d271d9905529a2cd64a8a169c3f495b99c168921d426a8e127c8a0387f0"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.021003 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" event={"ID":"8f20ab81-68d3-4973-9336-d00440b811f9","Type":"ContainerStarted","Data":"79952a0a8421ee1b0b2b001955bb7f36b91caf54d15cf0d36c3e2ca6382e1470"} Oct 01 13:22:18 crc kubenswrapper[4749]: E1001 13:22:18.022246 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" podUID="8f20ab81-68d3-4973-9336-d00440b811f9" Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.022906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" event={"ID":"40a29698-620f-45d7-b630-0cfe188dd09f","Type":"ContainerStarted","Data":"1d931f3ad7d25fd442696f47931946251061b36f7452ad66a78ead450deec0c4"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.024771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" event={"ID":"ab9def29-b23e-4af8-828b-3c4151503a96","Type":"ContainerStarted","Data":"c4c98596b2a6f310661397870332fd5c621980a43ed379038b2aa2ec6d1580d1"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.024816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" event={"ID":"ab9def29-b23e-4af8-828b-3c4151503a96","Type":"ContainerStarted","Data":"5d78825802f0671c1b41814081db100d1e075e15a4443c3437f32f76b5c90707"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.026013 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" event={"ID":"f7c97980-24c5-42e5-b60c-763bd31ad269","Type":"ContainerStarted","Data":"d40f57d1bc92d45a6457440dd1e84d682dd6e85894bca240721e1bf685a05aa8"} Oct 01 13:22:18 crc kubenswrapper[4749]: E1001 13:22:18.026281 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" podUID="ab9def29-b23e-4af8-828b-3c4151503a96" Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.027243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" 
event={"ID":"33e04d57-8c80-4ed0-b05b-1edf290d476e","Type":"ContainerStarted","Data":"59f43b4473ca52d85d032b578adc21c480e9cd85cf0295f4b7c8a95afc43775a"} Oct 01 13:22:18 crc kubenswrapper[4749]: E1001 13:22:18.028084 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" podUID="33e04d57-8c80-4ed0-b05b-1edf290d476e" Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.028755 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" event={"ID":"1d38e134-a674-403f-9a4b-4ee8de1fe763","Type":"ContainerStarted","Data":"537c7cc1bf68093af738506be9062e3786224376310e4443247b994648e8b2c2"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.030146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" event={"ID":"6a63cd00-64f1-42cf-8250-abc3dfc3a4ff","Type":"ContainerStarted","Data":"4bb306bff3d465bc37104864c6677444b8d696acc0ac463ffe35fe29a580e850"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.031489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" event={"ID":"a26beb61-7189-40d0-9284-e58654887bbd","Type":"ContainerStarted","Data":"f6de18a6a8a7c9ee3e259ca24153d7de51770c5686b3fbde8b10f593ead91dfc"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.032246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" 
event={"ID":"02d25e12-ca12-40f3-bc21-2b5a55fdba5d","Type":"ContainerStarted","Data":"46d936cf2587193f2c157d4f493f7dc65b036fa2bccd2bc5afbb70ac40585f0c"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.033833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" event={"ID":"acf0c68b-d009-4f2f-a21e-a2573842a063","Type":"ContainerStarted","Data":"70fd7b4c3729f570f48390d5795fc4ee95a2cec0e898cce87bfd1f211a37c7f3"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.033872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" event={"ID":"acf0c68b-d009-4f2f-a21e-a2573842a063","Type":"ContainerStarted","Data":"95f767d4295d693f0d0a20349195dca5ce9cad78877459fee9ad7a224ded6142"} Oct 01 13:22:18 crc kubenswrapper[4749]: E1001 13:22:18.050576 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" podUID="acf0c68b-d009-4f2f-a21e-a2573842a063" Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.066816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" event={"ID":"b544aac2-b3f7-453e-a05b-58f22b1b4fe1","Type":"ContainerStarted","Data":"9f1586e0e2b9e7b17ae7729b80e3c9350082b410c2c0718c8573bd7fc0e4a63d"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.073714 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.080949 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" event={"ID":"3bb59228-30b8-42af-b24c-dc50224fde04","Type":"ContainerStarted","Data":"1ff68132bf57e55dd8652593da4bde11874536fe77a3f2a5b41c18eed6dc50a6"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.080978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" event={"ID":"3bb59228-30b8-42af-b24c-dc50224fde04","Type":"ContainerStarted","Data":"abb6152f10661fa90113c6b888676c237d9e7d9dcc166757b94460f75b82dcbd"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.087186 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" event={"ID":"11409643-1cee-49c5-b3d0-fa1ec4cb1af0","Type":"ContainerStarted","Data":"49e4deb4ff24feb2e95577f64e42fe27340fad11b6814ca53c58714910967cec"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.089654 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" event={"ID":"c81045dc-62f0-4ae6-9e05-e26cc8f90611","Type":"ContainerStarted","Data":"12bf43e34e937b0e026e77299bc73026ddef9e5d29cbc7dbde420bc5a7c8ac53"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.090915 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" event={"ID":"bf6d3a96-3f74-44df-8e75-1865612d0303","Type":"ContainerStarted","Data":"92acf3fbb826bb054b3351c44da19ae845903c9fec3d272ea032579d06974692"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.090942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" event={"ID":"bf6d3a96-3f74-44df-8e75-1865612d0303","Type":"ContainerStarted","Data":"98f702b4d34e5307543baf2496def0774eae63842939219aa3f84330ca06b459"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.108041 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" event={"ID":"557ca73b-a87b-4d42-8d86-dfbd057ae1fd","Type":"ContainerStarted","Data":"bdd9d5589b4c9fb9aae53a4365e941162a799e883eb4c191ecb7cb7045771d4a"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.109614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" event={"ID":"557ca73b-a87b-4d42-8d86-dfbd057ae1fd","Type":"ContainerStarted","Data":"089b220d19b2282be516498cbadc1781e2c88473c48366ac09201ec8cd922aff"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.119337 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" event={"ID":"00534b7e-41c7-4935-8349-78aee327867e","Type":"ContainerStarted","Data":"84f8a730b54d2531d5c4696c5a5b7c7e86f41894e673b504d68f44adb78d9ec1"} Oct 01 13:22:18 crc kubenswrapper[4749]: E1001 13:22:18.120190 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" podUID="bf6d3a96-3f74-44df-8e75-1865612d0303" Oct 01 13:22:18 crc kubenswrapper[4749]: E1001 13:22:18.120307 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" podUID="3bb59228-30b8-42af-b24c-dc50224fde04" Oct 01 13:22:18 crc kubenswrapper[4749]: E1001 13:22:18.120468 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" podUID="557ca73b-a87b-4d42-8d86-dfbd057ae1fd" Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.122927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" event={"ID":"765c8918-abbc-47dd-8960-18292d54a9a0","Type":"ContainerStarted","Data":"2f329434c9b86c8041baf655e54aa7d0832740c4a3e2c5632c657c7128643f82"} Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.122968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" event={"ID":"765c8918-abbc-47dd-8960-18292d54a9a0","Type":"ContainerStarted","Data":"08d9c8daec10f7848a84a46c92fd3a25ae96a367068b205c2cdb5e466bfcf2ae"} Oct 01 13:22:18 crc kubenswrapper[4749]: E1001 13:22:18.128570 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" podUID="765c8918-abbc-47dd-8960-18292d54a9a0" Oct 01 13:22:18 crc kubenswrapper[4749]: I1001 13:22:18.680301 4749 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8"] Oct 01 13:22:19 crc kubenswrapper[4749]: I1001 13:22:19.131814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" event={"ID":"9aac26fb-f511-4491-a239-7c5f7ced5f43","Type":"ContainerStarted","Data":"84499daac0831c83aa4228f40b4ef15e2bf0d61d09b04573bdcf94a65b8c55da"} Oct 01 13:22:19 crc kubenswrapper[4749]: I1001 13:22:19.132150 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" event={"ID":"9aac26fb-f511-4491-a239-7c5f7ced5f43","Type":"ContainerStarted","Data":"16837aacd0045ecb569266acc6b09eee8f022106f38814985b286fa0f2358809"} Oct 01 13:22:19 crc kubenswrapper[4749]: E1001 13:22:19.133844 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" podUID="8f20ab81-68d3-4973-9336-d00440b811f9" Oct 01 13:22:19 crc kubenswrapper[4749]: E1001 13:22:19.134356 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" podUID="765c8918-abbc-47dd-8960-18292d54a9a0" Oct 01 13:22:19 crc kubenswrapper[4749]: E1001 13:22:19.134797 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" podUID="acf0c68b-d009-4f2f-a21e-a2573842a063" Oct 01 13:22:19 crc kubenswrapper[4749]: E1001 13:22:19.134987 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" podUID="557ca73b-a87b-4d42-8d86-dfbd057ae1fd" Oct 01 13:22:19 crc kubenswrapper[4749]: E1001 13:22:19.135031 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" podUID="bf6d3a96-3f74-44df-8e75-1865612d0303" Oct 01 13:22:19 crc kubenswrapper[4749]: E1001 13:22:19.135118 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" podUID="ab9def29-b23e-4af8-828b-3c4151503a96" Oct 01 13:22:19 crc kubenswrapper[4749]: E1001 13:22:19.137076 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" podUID="33e04d57-8c80-4ed0-b05b-1edf290d476e" Oct 01 13:22:19 crc kubenswrapper[4749]: E1001 13:22:19.140275 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" podUID="3bb59228-30b8-42af-b24c-dc50224fde04" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.196559 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" event={"ID":"f7c97980-24c5-42e5-b60c-763bd31ad269","Type":"ContainerStarted","Data":"0c9bb1750226049745b4b3a10f85932f36d813dacde4e1eee8c821740242863f"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.196967 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.196978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" event={"ID":"f7c97980-24c5-42e5-b60c-763bd31ad269","Type":"ContainerStarted","Data":"c2ea16c02a32c1324f7aea051b39f52ae0f67d559061988e46ee095f7cea933d"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.201856 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" event={"ID":"99a6dbcc-0b05-4471-b2d4-acacf72f6ff0","Type":"ContainerStarted","Data":"e5e9229a51c3ea7e3b99ee6d33d34fe91bbe1ab230689ee0fdd47d0f88ee6cd8"} Oct 01 13:22:27 crc 
kubenswrapper[4749]: I1001 13:22:27.204403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" event={"ID":"6647f74c-8bbb-490d-ade7-8b2fb5469ddc","Type":"ContainerStarted","Data":"9719afc597f04479619bd304319f2f67018eabf1b04104e7512c26dc479e2a93"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.209991 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" event={"ID":"1d38e134-a674-403f-9a4b-4ee8de1fe763","Type":"ContainerStarted","Data":"723bd545841e083b80f6ede7b7b61d706398d4db9650d8eec860786081992637"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.226854 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" event={"ID":"4a7c1ef4-c125-445b-9f1e-b24ee27e2938","Type":"ContainerStarted","Data":"094282d6fcd16b6e0e80511d144f7a74faae173ea38ec09a3efe57705eff8820"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.226900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" event={"ID":"4a7c1ef4-c125-445b-9f1e-b24ee27e2938","Type":"ContainerStarted","Data":"a03687ac070a59335aaa45d5f1462cd741d620790dbda8a88705b5c753b4de7c"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.227566 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.258716 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" event={"ID":"a26beb61-7189-40d0-9284-e58654887bbd","Type":"ContainerStarted","Data":"336446a230f1c23f4dcef447bd59342ab7c4ab69771388c4f7c3d0956002dd27"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.264104 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" event={"ID":"b544aac2-b3f7-453e-a05b-58f22b1b4fe1","Type":"ContainerStarted","Data":"3a1651d14f1940834e592cdc51a5000ca21b94ddb4468dedfbd0359d0b219d5b"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.289180 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" podStartSLOduration=3.612721741 podStartE2EDuration="12.289164456s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.298627406 +0000 UTC m=+997.352612305" lastFinishedPulling="2025-10-01 13:22:25.975070111 +0000 UTC m=+1006.029055020" observedRunningTime="2025-10-01 13:22:27.259675324 +0000 UTC m=+1007.313660223" watchObservedRunningTime="2025-10-01 13:22:27.289164456 +0000 UTC m=+1007.343149355" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.290490 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" podStartSLOduration=2.9237535770000003 podStartE2EDuration="12.290484734s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:16.600028566 +0000 UTC m=+996.654013465" lastFinishedPulling="2025-10-01 13:22:25.966759713 +0000 UTC m=+1006.020744622" observedRunningTime="2025-10-01 13:22:27.287365785 +0000 UTC m=+1007.341350684" watchObservedRunningTime="2025-10-01 13:22:27.290484734 +0000 UTC m=+1007.344469633" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.297161 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" event={"ID":"40a29698-620f-45d7-b630-0cfe188dd09f","Type":"ContainerStarted","Data":"9314cc2dbb406eb9b146e25970c516c39f85313f0a9cd4142758baa119f18b5f"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 
13:22:27.315307 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" event={"ID":"00534b7e-41c7-4935-8349-78aee327867e","Type":"ContainerStarted","Data":"2553d36499229e4e02f7b2f1ef0ff396db4523237e3744812c90611101755b95"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.332700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" event={"ID":"02d25e12-ca12-40f3-bc21-2b5a55fdba5d","Type":"ContainerStarted","Data":"3a75c0f51d5ba3c544a03751616d9d4ef10a37f13e0b994be2d0469b249b6c4f"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.334027 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" event={"ID":"11409643-1cee-49c5-b3d0-fa1ec4cb1af0","Type":"ContainerStarted","Data":"fe6413a8349c6e8d8eda15c772a576a33f81ea21cd21a5415f7a5d4d0c52c7e8"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.399262 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" event={"ID":"c81045dc-62f0-4ae6-9e05-e26cc8f90611","Type":"ContainerStarted","Data":"412a495e16493d9d0708dca019e7e77cc18faf9a91386a6bede65fbef746fde8"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.400072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.404634 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" event={"ID":"9aac26fb-f511-4491-a239-7c5f7ced5f43","Type":"ContainerStarted","Data":"df00f9bed7812f4f939acce583b15382da3c0bdef387af165f528f8f84adee80"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.405114 4749 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.405931 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" event={"ID":"6a63cd00-64f1-42cf-8250-abc3dfc3a4ff","Type":"ContainerStarted","Data":"3a391169e11a801beb5cdcd94b84a5ec4548a4969536e6f2c63f30c43618afa2"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.410643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.411952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" event={"ID":"0418f548-554a-4efc-8494-4edc9d56fc7f","Type":"ContainerStarted","Data":"2ebfcf0f83cee7b30a8630c53db7e35df0f7c8c1c92e11234c4cf2a703d0d0f0"} Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.415681 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" podStartSLOduration=3.723563316 podStartE2EDuration="12.415670518s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.288560098 +0000 UTC m=+997.342544997" lastFinishedPulling="2025-10-01 13:22:25.98066731 +0000 UTC m=+1006.034652199" observedRunningTime="2025-10-01 13:22:27.414205506 +0000 UTC m=+1007.468190405" watchObservedRunningTime="2025-10-01 13:22:27.415670518 +0000 UTC m=+1007.469655417" Oct 01 13:22:27 crc kubenswrapper[4749]: I1001 13:22:27.450919 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5fc59ccd99-c2sc8" podStartSLOduration=11.450902944 podStartE2EDuration="11.450902944s" 
podCreationTimestamp="2025-10-01 13:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:22:27.445109769 +0000 UTC m=+1007.499094668" watchObservedRunningTime="2025-10-01 13:22:27.450902944 +0000 UTC m=+1007.504887843" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.422806 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" event={"ID":"6a63cd00-64f1-42cf-8250-abc3dfc3a4ff","Type":"ContainerStarted","Data":"58355bacbbbf7347809139e9236f564a40e29461d0d18774759de74cea25eb53"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.423026 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.424835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" event={"ID":"40a29698-620f-45d7-b630-0cfe188dd09f","Type":"ContainerStarted","Data":"bbf88b5ed73a6185676392e7f2e2894e0419925b69fc9c5e777c70ba12fdc8a9"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.424975 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.439371 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" event={"ID":"99a6dbcc-0b05-4471-b2d4-acacf72f6ff0","Type":"ContainerStarted","Data":"8a7209d0cd6953110dd4c4d518a3a1afc6915617c50d0212163f78b6a3beb17d"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.440062 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" 
Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.457714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" event={"ID":"0418f548-554a-4efc-8494-4edc9d56fc7f","Type":"ContainerStarted","Data":"ed1746abc3ef870ad3741d6e1afef968d5883f0c211fe17b56bd9810e5e63e06"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.460538 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" event={"ID":"c81045dc-62f0-4ae6-9e05-e26cc8f90611","Type":"ContainerStarted","Data":"bcafb311f1b5ee5697180d3de48411454a5d7b87e083b74b12dab42e103aaa3e"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.460820 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" podStartSLOduration=4.747701893 podStartE2EDuration="13.460801664s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.277090991 +0000 UTC m=+997.331075890" lastFinishedPulling="2025-10-01 13:22:25.990190742 +0000 UTC m=+1006.044175661" observedRunningTime="2025-10-01 13:22:28.449745798 +0000 UTC m=+1008.503730727" watchObservedRunningTime="2025-10-01 13:22:28.460801664 +0000 UTC m=+1008.514786583" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.463870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" event={"ID":"1d38e134-a674-403f-9a4b-4ee8de1fe763","Type":"ContainerStarted","Data":"c6dcdb5fcd9df7b18197916a0e9552e92ab0cefdf3c0b50006a4a43458fc09d1"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.464489 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.466173 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" event={"ID":"6647f74c-8bbb-490d-ade7-8b2fb5469ddc","Type":"ContainerStarted","Data":"da050a62be8a1c9dc070d9d0f46020055f1420a9d5f51a6b5708d0cba947117e"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.466797 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.483621 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" event={"ID":"00534b7e-41c7-4935-8349-78aee327867e","Type":"ContainerStarted","Data":"0a6d874af6fbb9ea060d6b006d648fff7d3b47b6604f1d4058c10b2f57dfe03c"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.484465 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.486014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" event={"ID":"a26beb61-7189-40d0-9284-e58654887bbd","Type":"ContainerStarted","Data":"7f8770816ddc7e809e8f1c3114715a71e72cf4a7b73b77edc3eb942be0f4d05a"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.486568 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.487981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" event={"ID":"02d25e12-ca12-40f3-bc21-2b5a55fdba5d","Type":"ContainerStarted","Data":"9e3ac629382fb297458e106462636b746519d71010349e7d1d9f0fc7a3fcc9e7"} Oct 01 13:22:28 crc 
kubenswrapper[4749]: I1001 13:22:28.488531 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.490146 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" event={"ID":"11409643-1cee-49c5-b3d0-fa1ec4cb1af0","Type":"ContainerStarted","Data":"9a1131dc6b60e12051fd9ae7e6c237030637e1c76452073e4fefb37cc198aed4"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.490837 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.493571 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" event={"ID":"b544aac2-b3f7-453e-a05b-58f22b1b4fe1","Type":"ContainerStarted","Data":"17668543f71ac80d963d24421dc4296a0e0a84fb8b99aefcfa23d26ab0cdcf08"} Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.493685 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" podStartSLOduration=4.7669175809999995 podStartE2EDuration="13.493662632s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.254779634 +0000 UTC m=+997.308764533" lastFinishedPulling="2025-10-01 13:22:25.981524675 +0000 UTC m=+1006.035509584" observedRunningTime="2025-10-01 13:22:28.474028551 +0000 UTC m=+1008.528013480" watchObservedRunningTime="2025-10-01 13:22:28.493662632 +0000 UTC m=+1008.547647541" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.494158 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" Oct 01 13:22:28 crc 
kubenswrapper[4749]: I1001 13:22:28.517978 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" podStartSLOduration=4.41668042 podStartE2EDuration="13.517955756s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:16.880228568 +0000 UTC m=+996.934213467" lastFinishedPulling="2025-10-01 13:22:25.981503904 +0000 UTC m=+1006.035488803" observedRunningTime="2025-10-01 13:22:28.517382249 +0000 UTC m=+1008.571367168" watchObservedRunningTime="2025-10-01 13:22:28.517955756 +0000 UTC m=+1008.571940675" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.527185 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" podStartSLOduration=4.147373048 podStartE2EDuration="13.527162418s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:16.599946634 +0000 UTC m=+996.653931533" lastFinishedPulling="2025-10-01 13:22:25.979735964 +0000 UTC m=+1006.033720903" observedRunningTime="2025-10-01 13:22:28.502778962 +0000 UTC m=+1008.556763861" watchObservedRunningTime="2025-10-01 13:22:28.527162418 +0000 UTC m=+1008.581147337" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.553094 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" podStartSLOduration=4.84921527 podStartE2EDuration="13.553074358s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.276779472 +0000 UTC m=+997.330764371" lastFinishedPulling="2025-10-01 13:22:25.98063855 +0000 UTC m=+1006.034623459" observedRunningTime="2025-10-01 13:22:28.535327172 +0000 UTC m=+1008.589312091" watchObservedRunningTime="2025-10-01 13:22:28.553074358 +0000 UTC m=+1008.607059277" Oct 01 13:22:28 crc kubenswrapper[4749]: 
I1001 13:22:28.555624 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" podStartSLOduration=4.820824221 podStartE2EDuration="13.555615421s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.274876478 +0000 UTC m=+997.328861377" lastFinishedPulling="2025-10-01 13:22:26.009667668 +0000 UTC m=+1006.063652577" observedRunningTime="2025-10-01 13:22:28.551693259 +0000 UTC m=+1008.605678188" watchObservedRunningTime="2025-10-01 13:22:28.555615421 +0000 UTC m=+1008.609600320" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.578580 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" podStartSLOduration=4.853563365 podStartE2EDuration="13.578564596s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.254708972 +0000 UTC m=+997.308693871" lastFinishedPulling="2025-10-01 13:22:25.979710183 +0000 UTC m=+1006.033695102" observedRunningTime="2025-10-01 13:22:28.571498175 +0000 UTC m=+1008.625483094" watchObservedRunningTime="2025-10-01 13:22:28.578564596 +0000 UTC m=+1008.632549505" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.591622 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" podStartSLOduration=5.047169803 podStartE2EDuration="13.591593448s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.435173565 +0000 UTC m=+997.489158464" lastFinishedPulling="2025-10-01 13:22:25.97959717 +0000 UTC m=+1006.033582109" observedRunningTime="2025-10-01 13:22:28.587737658 +0000 UTC m=+1008.641722557" watchObservedRunningTime="2025-10-01 13:22:28.591593448 +0000 UTC m=+1008.645578387" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 
13:22:28.611640 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" podStartSLOduration=4.925799188 podStartE2EDuration="13.61161406s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.295411984 +0000 UTC m=+997.349396883" lastFinishedPulling="2025-10-01 13:22:25.981226816 +0000 UTC m=+1006.035211755" observedRunningTime="2025-10-01 13:22:28.602297774 +0000 UTC m=+1008.656282723" watchObservedRunningTime="2025-10-01 13:22:28.61161406 +0000 UTC m=+1008.665598979" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.623447 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" podStartSLOduration=4.528879473 podStartE2EDuration="13.623427048s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:16.879931869 +0000 UTC m=+996.933916758" lastFinishedPulling="2025-10-01 13:22:25.974479394 +0000 UTC m=+1006.028464333" observedRunningTime="2025-10-01 13:22:28.61897148 +0000 UTC m=+1008.672956379" watchObservedRunningTime="2025-10-01 13:22:28.623427048 +0000 UTC m=+1008.677411967" Oct 01 13:22:28 crc kubenswrapper[4749]: I1001 13:22:28.636786 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" podStartSLOduration=4.94302585 podStartE2EDuration="13.636762818s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.285279655 +0000 UTC m=+997.339264554" lastFinishedPulling="2025-10-01 13:22:25.979016613 +0000 UTC m=+1006.033001522" observedRunningTime="2025-10-01 13:22:28.63367912 +0000 UTC m=+1008.687664059" watchObservedRunningTime="2025-10-01 13:22:28.636762818 +0000 UTC m=+1008.690747757" Oct 01 13:22:29 crc kubenswrapper[4749]: I1001 13:22:29.506832 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" Oct 01 13:22:33 crc kubenswrapper[4749]: I1001 13:22:33.553323 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" event={"ID":"3bb59228-30b8-42af-b24c-dc50224fde04","Type":"ContainerStarted","Data":"1e04b143b8d771afb7667b0a0b1f68d5bb21bea202b7166d2bc68d6aa962aebd"} Oct 01 13:22:33 crc kubenswrapper[4749]: I1001 13:22:33.554188 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" Oct 01 13:22:33 crc kubenswrapper[4749]: I1001 13:22:33.554979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" event={"ID":"33e04d57-8c80-4ed0-b05b-1edf290d476e","Type":"ContainerStarted","Data":"d612c51e52ad50ed59553a616b5ef5c5c58dff7f7a7ee7cac340d3bf3f300806"} Oct 01 13:22:33 crc kubenswrapper[4749]: I1001 13:22:33.577268 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" podStartSLOduration=3.239056419 podStartE2EDuration="18.577248318s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.300959312 +0000 UTC m=+997.354944201" lastFinishedPulling="2025-10-01 13:22:32.639151201 +0000 UTC m=+1012.693136100" observedRunningTime="2025-10-01 13:22:33.571107373 +0000 UTC m=+1013.625092272" watchObservedRunningTime="2025-10-01 13:22:33.577248318 +0000 UTC m=+1013.631233227" Oct 01 13:22:33 crc kubenswrapper[4749]: I1001 13:22:33.588325 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-x26nl" podStartSLOduration=2.429795867 podStartE2EDuration="17.588305134s" 
podCreationTimestamp="2025-10-01 13:22:16 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.47806272 +0000 UTC m=+997.532047619" lastFinishedPulling="2025-10-01 13:22:32.636571987 +0000 UTC m=+1012.690556886" observedRunningTime="2025-10-01 13:22:33.582254031 +0000 UTC m=+1013.636238950" watchObservedRunningTime="2025-10-01 13:22:33.588305134 +0000 UTC m=+1013.642290023" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.579191 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" event={"ID":"bf6d3a96-3f74-44df-8e75-1865612d0303","Type":"ContainerStarted","Data":"95e742dd247746fb4595d7e69ba477e6b9214e503fbea9f7cab04664847bc8d8"} Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.580923 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.601894 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" podStartSLOduration=2.8346743930000002 podStartE2EDuration="20.601857844s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.31839151 +0000 UTC m=+997.372376409" lastFinishedPulling="2025-10-01 13:22:35.085574961 +0000 UTC m=+1015.139559860" observedRunningTime="2025-10-01 13:22:35.597680045 +0000 UTC m=+1015.651664964" watchObservedRunningTime="2025-10-01 13:22:35.601857844 +0000 UTC m=+1015.655842753" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.607434 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" event={"ID":"ab9def29-b23e-4af8-828b-3c4151503a96","Type":"ContainerStarted","Data":"5b21e4b88ee22411a17326058d778535ef788e54b2964216835507f5bb934ea0"} Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 
13:22:35.608064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.609545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" event={"ID":"557ca73b-a87b-4d42-8d86-dfbd057ae1fd","Type":"ContainerStarted","Data":"1f6f7f51d8f2f6c24a8dc11aa8ebb234ad6e9272a009caa0863d003f22d6efd3"} Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.609959 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.611102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" event={"ID":"765c8918-abbc-47dd-8960-18292d54a9a0","Type":"ContainerStarted","Data":"137b7de0bb23e3b3f8e78c8dea12b6e84a8d556cde2faa73306552caefd9bb78"} Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.611524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.613078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" event={"ID":"8f20ab81-68d3-4973-9336-d00440b811f9","Type":"ContainerStarted","Data":"704d17401431f65d65369f7fb327ec4c2afb603dab9c1aaf21f11b2f9adcc1d2"} Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.613478 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.627895 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" podStartSLOduration=3.439924868 podStartE2EDuration="20.627876527s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.439777017 +0000 UTC m=+997.493761916" lastFinishedPulling="2025-10-01 13:22:34.627728676 +0000 UTC m=+1014.681713575" observedRunningTime="2025-10-01 13:22:35.62656697 +0000 UTC m=+1015.680551869" watchObservedRunningTime="2025-10-01 13:22:35.627876527 +0000 UTC m=+1015.681861426" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.660458 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" podStartSLOduration=3.556886667 podStartE2EDuration="20.660441997s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.518445093 +0000 UTC m=+997.572429992" lastFinishedPulling="2025-10-01 13:22:34.622000423 +0000 UTC m=+1014.675985322" observedRunningTime="2025-10-01 13:22:35.657566685 +0000 UTC m=+1015.711551584" watchObservedRunningTime="2025-10-01 13:22:35.660441997 +0000 UTC m=+1015.714426896" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.676828 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" podStartSLOduration=3.4732503980000002 podStartE2EDuration="20.676809054s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.463936486 +0000 UTC m=+997.517921385" lastFinishedPulling="2025-10-01 13:22:34.667495142 +0000 UTC m=+1014.721480041" observedRunningTime="2025-10-01 13:22:35.674425876 +0000 UTC m=+1015.728410785" watchObservedRunningTime="2025-10-01 13:22:35.676809054 +0000 UTC m=+1015.730793973" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.695528 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" podStartSLOduration=3.072284529 podStartE2EDuration="20.695513878s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.493043498 +0000 UTC m=+997.547028397" lastFinishedPulling="2025-10-01 13:22:35.116272847 +0000 UTC m=+1015.170257746" observedRunningTime="2025-10-01 13:22:35.692626446 +0000 UTC m=+1015.746611345" watchObservedRunningTime="2025-10-01 13:22:35.695513878 +0000 UTC m=+1015.749498777" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.761003 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-vvxcs" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.778635 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-58njn" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.815719 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-ck4dv" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.819757 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-6w5m5" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.853499 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-h22cx" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.867155 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-4gdqf" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.956031 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-bhn8n" Oct 01 13:22:35 crc kubenswrapper[4749]: I1001 13:22:35.976699 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-292hc" Oct 01 13:22:36 crc kubenswrapper[4749]: I1001 13:22:36.147746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-cwjsk" Oct 01 13:22:36 crc kubenswrapper[4749]: I1001 13:22:36.148357 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-8z228" Oct 01 13:22:36 crc kubenswrapper[4749]: I1001 13:22:36.159957 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-tzpfm" Oct 01 13:22:36 crc kubenswrapper[4749]: I1001 13:22:36.193738 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-5qtjp" Oct 01 13:22:36 crc kubenswrapper[4749]: I1001 13:22:36.196330 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-j4s7v" Oct 01 13:22:36 crc kubenswrapper[4749]: I1001 13:22:36.555269 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-56f5865b8b-4g6n9" Oct 01 13:22:36 crc kubenswrapper[4749]: I1001 13:22:36.624418 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" event={"ID":"acf0c68b-d009-4f2f-a21e-a2573842a063","Type":"ContainerStarted","Data":"f16da710102730416b70078d88fac20f94b912a884b3a72dae7cfab864ef2e46"} Oct 01 13:22:36 crc kubenswrapper[4749]: I1001 13:22:36.655799 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" podStartSLOduration=2.7394340120000003 podStartE2EDuration="21.655734698s" podCreationTimestamp="2025-10-01 13:22:15 +0000 UTC" firstStartedPulling="2025-10-01 13:22:17.453746755 +0000 UTC m=+997.507731654" lastFinishedPulling="2025-10-01 13:22:36.370047431 +0000 UTC m=+1016.424032340" observedRunningTime="2025-10-01 13:22:36.653650539 +0000 UTC m=+1016.707635528" watchObservedRunningTime="2025-10-01 13:22:36.655734698 +0000 UTC m=+1016.709719597" Oct 01 13:22:46 crc kubenswrapper[4749]: I1001 13:22:46.162410 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-8898x" Oct 01 13:22:46 crc kubenswrapper[4749]: I1001 13:22:46.219071 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-9s6hz" Oct 01 13:22:46 crc kubenswrapper[4749]: I1001 13:22:46.220885 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-jqvd5" Oct 01 13:22:46 crc kubenswrapper[4749]: I1001 13:22:46.224129 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xk2zl" Oct 01 13:22:46 crc kubenswrapper[4749]: I1001 13:22:46.230954 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-cr9ww" Oct 01 13:22:46 crc kubenswrapper[4749]: I1001 13:22:46.557002 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" Oct 01 13:22:46 crc kubenswrapper[4749]: I1001 13:22:46.558881 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-bdz5p" Oct 01 13:22:46 crc kubenswrapper[4749]: I1001 13:22:46.658133 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8cg66wx" Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.940967 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77479b959-skc66"] Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.943322 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.945768 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77479b959-skc66"] Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.946836 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.946916 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.946920 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.946937 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-58drq" Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.989287 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-l4l8w"] Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.992344 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:03 crc kubenswrapper[4749]: I1001 13:23:03.995689 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.009130 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfafcf57-f586-4045-8de3-9047d545b210-config\") pod \"dnsmasq-dns-77479b959-skc66\" (UID: \"dfafcf57-f586-4045-8de3-9047d545b210\") " pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.009199 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7g8\" (UniqueName: \"kubernetes.io/projected/dfafcf57-f586-4045-8de3-9047d545b210-kube-api-access-lw7g8\") pod \"dnsmasq-dns-77479b959-skc66\" (UID: \"dfafcf57-f586-4045-8de3-9047d545b210\") " pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.009238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58qxd\" (UniqueName: \"kubernetes.io/projected/cae23a60-143f-4004-82ed-340963f9f971-kube-api-access-58qxd\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.009276 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-config\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.009297 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-dns-svc\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.017814 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-l4l8w"] Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.110540 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfafcf57-f586-4045-8de3-9047d545b210-config\") pod \"dnsmasq-dns-77479b959-skc66\" (UID: \"dfafcf57-f586-4045-8de3-9047d545b210\") " pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.110621 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7g8\" (UniqueName: \"kubernetes.io/projected/dfafcf57-f586-4045-8de3-9047d545b210-kube-api-access-lw7g8\") pod \"dnsmasq-dns-77479b959-skc66\" (UID: \"dfafcf57-f586-4045-8de3-9047d545b210\") " pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.110663 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58qxd\" (UniqueName: \"kubernetes.io/projected/cae23a60-143f-4004-82ed-340963f9f971-kube-api-access-58qxd\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.110682 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-config\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 
13:23:04.110702 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-dns-svc\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.111648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-dns-svc\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.111681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-config\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.111778 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfafcf57-f586-4045-8de3-9047d545b210-config\") pod \"dnsmasq-dns-77479b959-skc66\" (UID: \"dfafcf57-f586-4045-8de3-9047d545b210\") " pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.129894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7g8\" (UniqueName: \"kubernetes.io/projected/dfafcf57-f586-4045-8de3-9047d545b210-kube-api-access-lw7g8\") pod \"dnsmasq-dns-77479b959-skc66\" (UID: \"dfafcf57-f586-4045-8de3-9047d545b210\") " pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.131027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58qxd\" (UniqueName: 
\"kubernetes.io/projected/cae23a60-143f-4004-82ed-340963f9f971-kube-api-access-58qxd\") pod \"dnsmasq-dns-8b8d888b5-l4l8w\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.263436 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.308313 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.746828 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77479b959-skc66"] Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.835172 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-l4l8w"] Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.884136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77479b959-skc66" event={"ID":"dfafcf57-f586-4045-8de3-9047d545b210","Type":"ContainerStarted","Data":"6962997d18017cda430d6a952a825a2999e3c61285237f72afc3bb3245963375"} Oct 01 13:23:04 crc kubenswrapper[4749]: I1001 13:23:04.888888 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" event={"ID":"cae23a60-143f-4004-82ed-340963f9f971","Type":"ContainerStarted","Data":"ff1def226d6c0901fbb9b4eaba7300a5f15a9e4a2f3fffdea1e618162dda3681"} Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.727162 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77479b959-skc66"] Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.747075 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-547bf6db69-wzvwk"] Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.748373 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.758069 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-547bf6db69-wzvwk"] Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.792376 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czbb4\" (UniqueName: \"kubernetes.io/projected/4e05980f-053b-45c7-8421-4f025831fc1c-kube-api-access-czbb4\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.796146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-config\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.796265 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-dns-svc\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.897990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-config\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.898061 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-dns-svc\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.898104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czbb4\" (UniqueName: \"kubernetes.io/projected/4e05980f-053b-45c7-8421-4f025831fc1c-kube-api-access-czbb4\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.899409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-config\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.900047 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-dns-svc\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:07 crc kubenswrapper[4749]: I1001 13:23:07.921027 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czbb4\" (UniqueName: \"kubernetes.io/projected/4e05980f-053b-45c7-8421-4f025831fc1c-kube-api-access-czbb4\") pod \"dnsmasq-dns-547bf6db69-wzvwk\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.039111 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-l4l8w"] Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.067533 4749 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.071827 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-669746885c-bsc46"] Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.073095 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.083362 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-669746885c-bsc46"] Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.201496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjg6g\" (UniqueName: \"kubernetes.io/projected/42ec794e-8dae-4b0a-a77c-1f2f56047842-kube-api-access-qjg6g\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.201810 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-dns-svc\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.201958 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-config\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.303817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-config\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.303906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjg6g\" (UniqueName: \"kubernetes.io/projected/42ec794e-8dae-4b0a-a77c-1f2f56047842-kube-api-access-qjg6g\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.303929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-dns-svc\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.304860 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547bf6db69-wzvwk"] Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.305368 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-config\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.305384 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-dns-svc\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.327760 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qjg6g\" (UniqueName: \"kubernetes.io/projected/42ec794e-8dae-4b0a-a77c-1f2f56047842-kube-api-access-qjg6g\") pod \"dnsmasq-dns-669746885c-bsc46\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.343261 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6749c445df-rrs9k"] Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.344630 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.355720 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-rrs9k"] Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.392946 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.406873 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwlcn\" (UniqueName: \"kubernetes.io/projected/a35c1692-bf33-4303-9783-4b734e8d5aa4-kube-api-access-nwlcn\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.406926 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-dns-svc\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.407002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-config\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.507912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-config\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.507976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwlcn\" (UniqueName: \"kubernetes.io/projected/a35c1692-bf33-4303-9783-4b734e8d5aa4-kube-api-access-nwlcn\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.508006 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-dns-svc\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.508742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-config\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.508786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-dns-svc\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: 
\"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.523036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwlcn\" (UniqueName: \"kubernetes.io/projected/a35c1692-bf33-4303-9783-4b734e8d5aa4-kube-api-access-nwlcn\") pod \"dnsmasq-dns-6749c445df-rrs9k\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.670266 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.907002 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.908197 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.912670 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.915121 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.915286 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.915397 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.915299 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.915326 4749 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"rabbitmq-cell1-server-dockercfg-8tf6x" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.916992 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 13:23:08 crc kubenswrapper[4749]: I1001 13:23:08.929879 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.013694 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.013754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.013790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35e1759f-e27a-4891-9fc0-37753b25689d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.013930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc 
kubenswrapper[4749]: I1001 13:23:09.013968 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.013986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.014002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.014122 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.014188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc 
kubenswrapper[4749]: I1001 13:23:09.014289 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxhgw\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-kube-api-access-gxhgw\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.014328 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35e1759f-e27a-4891-9fc0-37753b25689d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.115856 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.115928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35e1759f-e27a-4891-9fc0-37753b25689d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.115971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.116086 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.116112 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.116168 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.116185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.116229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.116268 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxhgw\" 
(UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-kube-api-access-gxhgw\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.116294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35e1759f-e27a-4891-9fc0-37753b25689d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.116334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.117506 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.117549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.117672 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"35e1759f-e27a-4891-9fc0-37753b25689d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.117727 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.118979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.119293 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.122620 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35e1759f-e27a-4891-9fc0-37753b25689d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.126298 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.126790 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35e1759f-e27a-4891-9fc0-37753b25689d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.132175 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.134817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxhgw\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-kube-api-access-gxhgw\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.152598 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.184958 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.196647 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.198628 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.205689 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.205692 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.205953 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.206076 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.206321 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.207413 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.210876 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-7gp4p" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.225996 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.318914 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d655140-d63d-4e40-8de1-875213f37d4a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.318959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhfvw\" (UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-kube-api-access-dhfvw\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319182 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-confd\") pod 
\"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319326 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d655140-d63d-4e40-8de1-875213f37d4a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319451 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319476 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.319561 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.421735 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d655140-d63d-4e40-8de1-875213f37d4a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.422103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhfvw\" (UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-kube-api-access-dhfvw\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.422422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.422696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.422944 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.422860 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.423254 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.423507 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d655140-d63d-4e40-8de1-875213f37d4a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.423765 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.423993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.424290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.424482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.424822 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.424427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.425792 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.426449 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.426662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1d655140-d63d-4e40-8de1-875213f37d4a-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.427358 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.427766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1d655140-d63d-4e40-8de1-875213f37d4a-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.428113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.430697 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1d655140-d63d-4e40-8de1-875213f37d4a-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.445179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhfvw\" (UniqueName: \"kubernetes.io/projected/1d655140-d63d-4e40-8de1-875213f37d4a-kube-api-access-dhfvw\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.447876 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.449385 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.452526 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.452947 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.453080 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.453338 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gfcg5" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.453475 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.453577 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.455354 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.457123 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.467081 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"1d655140-d63d-4e40-8de1-875213f37d4a\") " pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.526391 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.526498 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527357 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-config-data\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527538 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527573 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89621c5c-1d46-44be-852f-1a37dccf02e9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89621c5c-1d46-44be-852f-1a37dccf02e9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.527650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rdjp\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-kube-api-access-7rdjp\") pod 
\"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.531555 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.629753 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.629844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.629877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.629910 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-config-data\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.629933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.629968 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.629996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.630018 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89621c5c-1d46-44be-852f-1a37dccf02e9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.630079 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89621c5c-1d46-44be-852f-1a37dccf02e9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.630104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rdjp\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-kube-api-access-7rdjp\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " 
pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.630163 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.631465 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.631904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.632096 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.632838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-config-data\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.633538 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.635144 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89621c5c-1d46-44be-852f-1a37dccf02e9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.635303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.635681 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.635876 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89621c5c-1d46-44be-852f-1a37dccf02e9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.637151 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " 
pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.649751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rdjp\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-kube-api-access-7rdjp\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.653668 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " pod="openstack/rabbitmq-server-0" Oct 01 13:23:09 crc kubenswrapper[4749]: I1001 13:23:09.816412 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.887947 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.890890 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.894724 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.895021 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bcqht" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.895134 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.895671 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.895967 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.900611 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.902087 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.986321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.986362 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc 
kubenswrapper[4749]: I1001 13:23:11.986585 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-kolla-config\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.986737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-operator-scripts\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.986783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-secrets\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.986922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-config-data-generated\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.986994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.987129 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-config-data-default\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:11 crc kubenswrapper[4749]: I1001 13:23:11.987162 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qszzp\" (UniqueName: \"kubernetes.io/projected/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-kube-api-access-qszzp\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-config-data-default\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qszzp\" (UniqueName: \"kubernetes.io/projected/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-kube-api-access-qszzp\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088284 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088310 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088334 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-kolla-config\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088379 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-operator-scripts\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-secrets\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088449 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-config-data-generated\") pod \"openstack-galera-0\" (UID: 
\"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.088887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-config-data-generated\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.089156 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.089613 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-config-data-default\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.090092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-kolla-config\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.091499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-operator-scripts\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.097882 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-secrets\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.098109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.108608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qszzp\" (UniqueName: \"kubernetes.io/projected/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-kube-api-access-qszzp\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.109369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/215f6c11-74c0-4e5e-a39d-8af23dd5e4af-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.113018 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"215f6c11-74c0-4e5e-a39d-8af23dd5e4af\") " pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.209684 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.847341 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.851851 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.855911 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-n7vfg" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.856242 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.856448 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.856515 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.856655 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.898303 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.898367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.898457 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh6wx\" (UniqueName: \"kubernetes.io/projected/b75b73e1-aac4-41c8-9ad7-afe216cf9741-kube-api-access-dh6wx\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.898484 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.898505 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.898536 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.898571 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-config-data-default\") pod 
\"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.907517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:12 crc kubenswrapper[4749]: I1001 13:23:12.907577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b75b73e1-aac4-41c8-9ad7-afe216cf9741-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.009583 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.009878 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.010840 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: 
\"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.009952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.012644 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.012684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b75b73e1-aac4-41c8-9ad7-afe216cf9741-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.012823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.012858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 
01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.012996 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh6wx\" (UniqueName: \"kubernetes.io/projected/b75b73e1-aac4-41c8-9ad7-afe216cf9741-kube-api-access-dh6wx\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.013024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.013047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.013595 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b75b73e1-aac4-41c8-9ad7-afe216cf9741-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.013786 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.014194 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b75b73e1-aac4-41c8-9ad7-afe216cf9741-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.017032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.017172 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.017806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b75b73e1-aac4-41c8-9ad7-afe216cf9741-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.030756 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh6wx\" (UniqueName: \"kubernetes.io/projected/b75b73e1-aac4-41c8-9ad7-afe216cf9741-kube-api-access-dh6wx\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.043403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"b75b73e1-aac4-41c8-9ad7-afe216cf9741\") " pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.075004 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.076326 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.079281 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.079469 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-56t5h" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.079626 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.082726 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.114904 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.114961 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftcn6\" (UniqueName: \"kubernetes.io/projected/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-kube-api-access-ftcn6\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.115045 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.115070 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-config-data\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.115097 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-kolla-config\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.188373 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.216941 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.217073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-config-data\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.217145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-kolla-config\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.217257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.217290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftcn6\" (UniqueName: \"kubernetes.io/projected/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-kube-api-access-ftcn6\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.218141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-kolla-config\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.218789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-config-data\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.222782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-memcached-tls-certs\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.222796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-combined-ca-bundle\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.234147 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftcn6\" (UniqueName: \"kubernetes.io/projected/6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09-kube-api-access-ftcn6\") pod \"memcached-0\" (UID: \"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09\") " pod="openstack/memcached-0" Oct 01 13:23:13 crc kubenswrapper[4749]: I1001 13:23:13.395359 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 01 13:23:14 crc kubenswrapper[4749]: I1001 13:23:14.828420 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:23:14 crc kubenswrapper[4749]: I1001 13:23:14.829980 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:23:14 crc kubenswrapper[4749]: I1001 13:23:14.832258 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-x767n" Oct 01 13:23:14 crc kubenswrapper[4749]: I1001 13:23:14.842922 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:23:14 crc kubenswrapper[4749]: I1001 13:23:14.947422 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzwd6\" (UniqueName: \"kubernetes.io/projected/e2054662-5786-4a04-a7c9-16fe32a04610-kube-api-access-nzwd6\") pod \"kube-state-metrics-0\" (UID: \"e2054662-5786-4a04-a7c9-16fe32a04610\") " pod="openstack/kube-state-metrics-0" Oct 01 13:23:15 crc kubenswrapper[4749]: I1001 13:23:15.049231 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzwd6\" (UniqueName: \"kubernetes.io/projected/e2054662-5786-4a04-a7c9-16fe32a04610-kube-api-access-nzwd6\") pod \"kube-state-metrics-0\" (UID: \"e2054662-5786-4a04-a7c9-16fe32a04610\") " pod="openstack/kube-state-metrics-0" Oct 01 13:23:15 crc kubenswrapper[4749]: I1001 13:23:15.073056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzwd6\" (UniqueName: \"kubernetes.io/projected/e2054662-5786-4a04-a7c9-16fe32a04610-kube-api-access-nzwd6\") pod \"kube-state-metrics-0\" (UID: \"e2054662-5786-4a04-a7c9-16fe32a04610\") " pod="openstack/kube-state-metrics-0" Oct 01 13:23:15 crc kubenswrapper[4749]: I1001 13:23:15.207382 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.163131 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.170338 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.172294 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.173083 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-df59c" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.173404 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.173703 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.173981 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.180518 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.189702 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.266822 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-thanos-prometheus-http-client-file\") 
pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.266903 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-config\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.266965 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0620852a-daa1-4b0a-91ea-910dd2c379c9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.267028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.267062 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0620852a-daa1-4b0a-91ea-910dd2c379c9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.267120 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44qf4\" (UniqueName: 
\"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-kube-api-access-44qf4\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.267207 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.267260 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.368075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.368122 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-config\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.368160 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0620852a-daa1-4b0a-91ea-910dd2c379c9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.368193 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.368362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0620852a-daa1-4b0a-91ea-910dd2c379c9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.368470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44qf4\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-kube-api-access-44qf4\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.368526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.368579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.369775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0620852a-daa1-4b0a-91ea-910dd2c379c9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.372731 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.372791 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/486c2550edb1c82035b2963cc60708287426e0ee5d361c9f2f060b32a3c68a50/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.373376 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.375645 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0620852a-daa1-4b0a-91ea-910dd2c379c9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.378835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.383247 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-config\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.385107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44qf4\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-kube-api-access-44qf4\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.385155 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.415328 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:16 crc kubenswrapper[4749]: I1001 13:23:16.512530 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.269434 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.272004 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.274722 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.274780 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.274963 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pwtq2" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.275012 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.276140 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.285464 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.421871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtgnl\" (UniqueName: 
\"kubernetes.io/projected/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-kube-api-access-xtgnl\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.422000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.422024 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.422040 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.422075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-config\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.422103 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-scripts\") pod 
\"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.422131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.422298 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.486915 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sl4xv"] Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.487881 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.489638 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-dt4fg" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.490315 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.490674 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.498280 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sl4xv"] Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.523553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.523595 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.523616 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.523651 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xtgnl\" (UniqueName: \"kubernetes.io/projected/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-kube-api-access-xtgnl\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.523720 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.523737 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.523754 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.523776 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-config\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.524775 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") device mount path 
\"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.525535 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.526776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-config\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.530111 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.537059 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.540248 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtgnl\" (UniqueName: \"kubernetes.io/projected/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-kube-api-access-xtgnl\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.542899 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.545956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb007c1f-6b53-4c8a-9921-85ccd3d5dad5-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.566296 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-75t94"] Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.566920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5\") " pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.568313 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.579482 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-75t94"] Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.624942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-run-ovn\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.625025 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4035d0d3-eeec-429f-b31e-ab4649ecf92a-scripts\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.625048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-log-ovn\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.625085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-run\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.625163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4035d0d3-eeec-429f-b31e-ab4649ecf92a-ovn-controller-tls-certs\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.625197 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4035d0d3-eeec-429f-b31e-ab4649ecf92a-combined-ca-bundle\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.625230 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p5xx\" (UniqueName: \"kubernetes.io/projected/4035d0d3-eeec-429f-b31e-ab4649ecf92a-kube-api-access-2p5xx\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.632465 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.726691 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-run\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.726751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4035d0d3-eeec-429f-b31e-ab4649ecf92a-ovn-controller-tls-certs\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.726783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-lib\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.726801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-run\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.726819 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4035d0d3-eeec-429f-b31e-ab4649ecf92a-combined-ca-bundle\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: 
I1001 13:23:19.726833 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p5xx\" (UniqueName: \"kubernetes.io/projected/4035d0d3-eeec-429f-b31e-ab4649ecf92a-kube-api-access-2p5xx\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.726850 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-log\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.726889 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45d86d96-b332-498e-a952-c34007c2f07b-scripts\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.726947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-run-ovn\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.727016 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4035d0d3-eeec-429f-b31e-ab4649ecf92a-scripts\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.727035 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-log-ovn\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.727051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-etc-ovs\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.727076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nms4h\" (UniqueName: \"kubernetes.io/projected/45d86d96-b332-498e-a952-c34007c2f07b-kube-api-access-nms4h\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.727561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-run-ovn\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.728002 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-run\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.729448 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4035d0d3-eeec-429f-b31e-ab4649ecf92a-scripts\") pod \"ovn-controller-sl4xv\" 
(UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.729635 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4035d0d3-eeec-429f-b31e-ab4649ecf92a-var-log-ovn\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.731466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4035d0d3-eeec-429f-b31e-ab4649ecf92a-ovn-controller-tls-certs\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.740289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4035d0d3-eeec-429f-b31e-ab4649ecf92a-combined-ca-bundle\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.743253 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p5xx\" (UniqueName: \"kubernetes.io/projected/4035d0d3-eeec-429f-b31e-ab4649ecf92a-kube-api-access-2p5xx\") pod \"ovn-controller-sl4xv\" (UID: \"4035d0d3-eeec-429f-b31e-ab4649ecf92a\") " pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.812514 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.828564 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-etc-ovs\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.828617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nms4h\" (UniqueName: \"kubernetes.io/projected/45d86d96-b332-498e-a952-c34007c2f07b-kube-api-access-nms4h\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.828674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-lib\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.828696 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-run\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.828721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-log\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.828773 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45d86d96-b332-498e-a952-c34007c2f07b-scripts\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.828796 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-etc-ovs\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.828884 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-log\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.829084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-run\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.829402 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/45d86d96-b332-498e-a952-c34007c2f07b-var-lib\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.830861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45d86d96-b332-498e-a952-c34007c2f07b-scripts\") pod 
\"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.844719 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nms4h\" (UniqueName: \"kubernetes.io/projected/45d86d96-b332-498e-a952-c34007c2f07b-kube-api-access-nms4h\") pod \"ovn-controller-ovs-75t94\" (UID: \"45d86d96-b332-498e-a952-c34007c2f07b\") " pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:19 crc kubenswrapper[4749]: I1001 13:23:19.925272 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:20 crc kubenswrapper[4749]: E1001 13:23:20.943774 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 01 13:23:20 crc kubenswrapper[4749]: E1001 13:23:20.943831 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 01 13:23:20 crc kubenswrapper[4749]: E1001 13:23:20.944011 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-58qxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8b8d888b5-l4l8w_openstack(cae23a60-143f-4004-82ed-340963f9f971): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:23:20 crc kubenswrapper[4749]: E1001 13:23:20.945283 4749 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" podUID="cae23a60-143f-4004-82ed-340963f9f971" Oct 01 13:23:21 crc kubenswrapper[4749]: E1001 13:23:21.000045 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 01 13:23:21 crc kubenswrapper[4749]: E1001 13:23:21.000399 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 01 13:23:21 crc kubenswrapper[4749]: E1001 13:23:21.000528 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lw7g8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77479b959-skc66_openstack(dfafcf57-f586-4045-8de3-9047d545b210): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:23:21 crc kubenswrapper[4749]: E1001 13:23:21.009286 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-77479b959-skc66" podUID="dfafcf57-f586-4045-8de3-9047d545b210" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.781612 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.786721 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.842517 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:23:21 crc kubenswrapper[4749]: W1001 13:23:21.843110 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d655140_d63d_4e40_8de1_875213f37d4a.slice/crio-3c2f213217c78d9449f7407d28a397c8d941feeec69658d9607e2f5ba98dcda9 WatchSource:0}: Error finding container 3c2f213217c78d9449f7407d28a397c8d941feeec69658d9607e2f5ba98dcda9: Status 404 returned error can't find the container with id 3c2f213217c78d9449f7407d28a397c8d941feeec69658d9607e2f5ba98dcda9 Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.863161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 01 13:23:21 crc kubenswrapper[4749]: W1001 13:23:21.872201 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35e1759f_e27a_4891_9fc0_37753b25689d.slice/crio-f48f67800299fda7910707cb5ce3116921337ba37ef85bfbd1284727b8e01a1e WatchSource:0}: Error finding container f48f67800299fda7910707cb5ce3116921337ba37ef85bfbd1284727b8e01a1e: Status 404 returned error can't find the container with id f48f67800299fda7910707cb5ce3116921337ba37ef85bfbd1284727b8e01a1e Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.872721 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.872952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfafcf57-f586-4045-8de3-9047d545b210-config\") pod \"dfafcf57-f586-4045-8de3-9047d545b210\" (UID: \"dfafcf57-f586-4045-8de3-9047d545b210\") " Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.873182 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw7g8\" (UniqueName: \"kubernetes.io/projected/dfafcf57-f586-4045-8de3-9047d545b210-kube-api-access-lw7g8\") pod \"dfafcf57-f586-4045-8de3-9047d545b210\" (UID: \"dfafcf57-f586-4045-8de3-9047d545b210\") " Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.874188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfafcf57-f586-4045-8de3-9047d545b210-config" (OuterVolumeSpecName: "config") pod "dfafcf57-f586-4045-8de3-9047d545b210" (UID: "dfafcf57-f586-4045-8de3-9047d545b210"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.881808 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfafcf57-f586-4045-8de3-9047d545b210-kube-api-access-lw7g8" (OuterVolumeSpecName: "kube-api-access-lw7g8") pod "dfafcf57-f586-4045-8de3-9047d545b210" (UID: "dfafcf57-f586-4045-8de3-9047d545b210"). InnerVolumeSpecName "kube-api-access-lw7g8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.975748 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-config\") pod \"cae23a60-143f-4004-82ed-340963f9f971\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.975805 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-dns-svc\") pod \"cae23a60-143f-4004-82ed-340963f9f971\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.975992 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58qxd\" (UniqueName: \"kubernetes.io/projected/cae23a60-143f-4004-82ed-340963f9f971-kube-api-access-58qxd\") pod \"cae23a60-143f-4004-82ed-340963f9f971\" (UID: \"cae23a60-143f-4004-82ed-340963f9f971\") " Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.976278 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-config" (OuterVolumeSpecName: "config") pod "cae23a60-143f-4004-82ed-340963f9f971" (UID: "cae23a60-143f-4004-82ed-340963f9f971"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.976564 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw7g8\" (UniqueName: \"kubernetes.io/projected/dfafcf57-f586-4045-8de3-9047d545b210-kube-api-access-lw7g8\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.976581 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.976593 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfafcf57-f586-4045-8de3-9047d545b210-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.976877 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cae23a60-143f-4004-82ed-340963f9f971" (UID: "cae23a60-143f-4004-82ed-340963f9f971"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:21 crc kubenswrapper[4749]: I1001 13:23:21.978970 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae23a60-143f-4004-82ed-340963f9f971-kube-api-access-58qxd" (OuterVolumeSpecName: "kube-api-access-58qxd") pod "cae23a60-143f-4004-82ed-340963f9f971" (UID: "cae23a60-143f-4004-82ed-340963f9f971"). InnerVolumeSpecName "kube-api-access-58qxd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.073158 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.078532 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58qxd\" (UniqueName: \"kubernetes.io/projected/cae23a60-143f-4004-82ed-340963f9f971-kube-api-access-58qxd\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.078660 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cae23a60-143f-4004-82ed-340963f9f971-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.081943 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 13:23:22 crc kubenswrapper[4749]: W1001 13:23:22.083529 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb75b73e1_aac4_41c8_9ad7_afe216cf9741.slice/crio-3cbbdec86270de496b61d54efbc5e92cc7ff2126facf5cf2dfcc320d71b162ae WatchSource:0}: Error finding container 3cbbdec86270de496b61d54efbc5e92cc7ff2126facf5cf2dfcc320d71b162ae: Status 404 returned error can't find the container with id 3cbbdec86270de496b61d54efbc5e92cc7ff2126facf5cf2dfcc320d71b162ae Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.116013 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09","Type":"ContainerStarted","Data":"f9f867993c8125435f3bb21dd5f11b8962ae2656fc8c9ddf63db6a04817fc440"} Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.119533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"35e1759f-e27a-4891-9fc0-37753b25689d","Type":"ContainerStarted","Data":"f48f67800299fda7910707cb5ce3116921337ba37ef85bfbd1284727b8e01a1e"} Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.121032 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-669746885c-bsc46"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.121627 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1d655140-d63d-4e40-8de1-875213f37d4a","Type":"ContainerStarted","Data":"3c2f213217c78d9449f7407d28a397c8d941feeec69658d9607e2f5ba98dcda9"} Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.123370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77479b959-skc66" event={"ID":"dfafcf57-f586-4045-8de3-9047d545b210","Type":"ContainerDied","Data":"6962997d18017cda430d6a952a825a2999e3c61285237f72afc3bb3245963375"} Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.123429 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77479b959-skc66" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.130448 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-rrs9k"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.167323 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.184451 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547bf6db69-wzvwk"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.184496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89621c5c-1d46-44be-852f-1a37dccf02e9","Type":"ContainerStarted","Data":"df2500bae23962e3728d43cfc79e984492a602cd453ccb3fe88cf2078b2e10e7"} Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.187385 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" event={"ID":"cae23a60-143f-4004-82ed-340963f9f971","Type":"ContainerDied","Data":"ff1def226d6c0901fbb9b4eaba7300a5f15a9e4a2f3fffdea1e618162dda3681"} Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.187541 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8d888b5-l4l8w" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.222987 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.248891 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.260506 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.263950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-56xgl" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.264154 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.264281 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.267999 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.293291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.347597 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77479b959-skc66"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.370340 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77479b959-skc66"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.384554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chv86\" (UniqueName: \"kubernetes.io/projected/854288a3-bb59-4721-b1ac-059920cd8c30-kube-api-access-chv86\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.384600 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " 
pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.384626 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/854288a3-bb59-4721-b1ac-059920cd8c30-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.384672 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.384693 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.384713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.384735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/854288a3-bb59-4721-b1ac-059920cd8c30-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 
13:23:22.384774 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854288a3-bb59-4721-b1ac-059920cd8c30-config\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.388754 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sl4xv"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.420615 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.441418 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-l4l8w"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.446354 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-l4l8w"] Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.461324 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 13:23:22 crc kubenswrapper[4749]: W1001 13:23:22.476373 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb007c1f_6b53_4c8a_9921_85ccd3d5dad5.slice/crio-553c7ec203850eefdb39c05f2459a685f82d6cef434204ca61dc9e4c7c98fc6c WatchSource:0}: Error finding container 553c7ec203850eefdb39c05f2459a685f82d6cef434204ca61dc9e4c7c98fc6c: Status 404 returned error can't find the container with id 553c7ec203850eefdb39c05f2459a685f82d6cef434204ca61dc9e4c7c98fc6c Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.486503 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854288a3-bb59-4721-b1ac-059920cd8c30-config\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " 
pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.486594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chv86\" (UniqueName: \"kubernetes.io/projected/854288a3-bb59-4721-b1ac-059920cd8c30-kube-api-access-chv86\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.486625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.486647 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/854288a3-bb59-4721-b1ac-059920cd8c30-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.486694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.486713 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.486729 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.486750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/854288a3-bb59-4721-b1ac-059920cd8c30-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.487430 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/854288a3-bb59-4721-b1ac-059920cd8c30-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.487667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/854288a3-bb59-4721-b1ac-059920cd8c30-config\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.487753 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.488476 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/854288a3-bb59-4721-b1ac-059920cd8c30-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " 
pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.493074 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.493434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.493979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/854288a3-bb59-4721-b1ac-059920cd8c30-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.502076 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chv86\" (UniqueName: \"kubernetes.io/projected/854288a3-bb59-4721-b1ac-059920cd8c30-kube-api-access-chv86\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.533121 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"854288a3-bb59-4721-b1ac-059920cd8c30\") " pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.536383 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-ovs-75t94"] Oct 01 13:23:22 crc kubenswrapper[4749]: E1001 13:23:22.570418 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:38.102.83.30:5001/podified-master-centos10/openstack-ovn-base:watcher_latest,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd4h95hch569h555hbch66hd7h5cch5b8h5d6h664h5b4h696h97h599h5c6h59fh685h6bh56hbdh64chf8h586h86hb9h64ch589h5dhfch59cq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nms4h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-75t94_openstack(45d86d96-b332-498e-a952-c34007c2f07b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 13:23:22 crc kubenswrapper[4749]: E1001 13:23:22.572526 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/ovn-controller-ovs-75t94" podUID="45d86d96-b332-498e-a952-c34007c2f07b" Oct 01 13:23:22 crc kubenswrapper[4749]: I1001 13:23:22.596814 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.120089 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 13:23:23 crc kubenswrapper[4749]: W1001 13:23:23.197348 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod854288a3_bb59_4721_b1ac_059920cd8c30.slice/crio-ae78c33912214d314b0334944bad0db447782ef2635f2d13bd5102f6e09acc88 WatchSource:0}: Error finding container ae78c33912214d314b0334944bad0db447782ef2635f2d13bd5102f6e09acc88: Status 404 returned error can't find the container with id ae78c33912214d314b0334944bad0db447782ef2635f2d13bd5102f6e09acc88 Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.208560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2054662-5786-4a04-a7c9-16fe32a04610","Type":"ContainerStarted","Data":"b79164ed521ff520a0259cb9e0e955d30292cd01eeed3003bcd4980acce52cdd"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.210294 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerStarted","Data":"da8ce3baa560a4664d972965bc675dd5e94205d13d6beb6b9674932cb85d3b8f"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.212631 4749 generic.go:334] "Generic (PLEG): container finished" podID="a35c1692-bf33-4303-9783-4b734e8d5aa4" containerID="9e169d4dab141d47abeb3d0ffd7edff6b8a1e9624bd23de4d4e1617ca3ce7535" exitCode=0 Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.212724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" event={"ID":"a35c1692-bf33-4303-9783-4b734e8d5aa4","Type":"ContainerDied","Data":"9e169d4dab141d47abeb3d0ffd7edff6b8a1e9624bd23de4d4e1617ca3ce7535"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 
13:23:23.212773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" event={"ID":"a35c1692-bf33-4303-9783-4b734e8d5aa4","Type":"ContainerStarted","Data":"ac4cf7933926f019c1d0d4454970086fe1c984ea50fa213103650e7956309b27"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.216376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"215f6c11-74c0-4e5e-a39d-8af23dd5e4af","Type":"ContainerStarted","Data":"9c25ea6cd2259eb1f1a8ac489ff0a9aece06be3113108bdcb6dc8b3f4fa118f9"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.218058 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sl4xv" event={"ID":"4035d0d3-eeec-429f-b31e-ab4649ecf92a","Type":"ContainerStarted","Data":"d4a61d67e1cb9d522b82a19c8b7fb9591489f783ba858ef6da4905e9f7ab148f"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.222414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-75t94" event={"ID":"45d86d96-b332-498e-a952-c34007c2f07b","Type":"ContainerStarted","Data":"31aa5dd06c8169a7088d47269cfdd0d627420e811d3b0d72ef34a1825e7dbe08"} Oct 01 13:23:23 crc kubenswrapper[4749]: E1001 13:23:23.223955 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-ovn-base:watcher_latest\\\"\"" pod="openstack/ovn-controller-ovs-75t94" podUID="45d86d96-b332-498e-a952-c34007c2f07b" Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.224375 4749 generic.go:334] "Generic (PLEG): container finished" podID="42ec794e-8dae-4b0a-a77c-1f2f56047842" containerID="28bee992374b3b331e90cd75e46afc645faa92ccf51e145366d58545f00f1fdd" exitCode=0 Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.224428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-669746885c-bsc46" event={"ID":"42ec794e-8dae-4b0a-a77c-1f2f56047842","Type":"ContainerDied","Data":"28bee992374b3b331e90cd75e46afc645faa92ccf51e145366d58545f00f1fdd"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.224452 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669746885c-bsc46" event={"ID":"42ec794e-8dae-4b0a-a77c-1f2f56047842","Type":"ContainerStarted","Data":"522abbbe0783fc64b98dcab9396d6e6a5eb2316d58b8eb6652c2848c67cc75ba"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.234416 4749 generic.go:334] "Generic (PLEG): container finished" podID="4e05980f-053b-45c7-8421-4f025831fc1c" containerID="c9c82056938f79ac38e6e8e078c89ff84679dc126e3fade3a2426a7c0f2d5832" exitCode=0 Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.243055 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae23a60-143f-4004-82ed-340963f9f971" path="/var/lib/kubelet/pods/cae23a60-143f-4004-82ed-340963f9f971/volumes" Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.243442 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfafcf57-f586-4045-8de3-9047d545b210" path="/var/lib/kubelet/pods/dfafcf57-f586-4045-8de3-9047d545b210/volumes" Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.243911 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5","Type":"ContainerStarted","Data":"553c7ec203850eefdb39c05f2459a685f82d6cef434204ca61dc9e4c7c98fc6c"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.244093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" event={"ID":"4e05980f-053b-45c7-8421-4f025831fc1c","Type":"ContainerDied","Data":"c9c82056938f79ac38e6e8e078c89ff84679dc126e3fade3a2426a7c0f2d5832"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.244122 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" event={"ID":"4e05980f-053b-45c7-8421-4f025831fc1c","Type":"ContainerStarted","Data":"471ed49a05166d55dbf52f3b39df134dc59372a684588edeae24292989bcd5c1"} Oct 01 13:23:23 crc kubenswrapper[4749]: I1001 13:23:23.244135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b75b73e1-aac4-41c8-9ad7-afe216cf9741","Type":"ContainerStarted","Data":"3cbbdec86270de496b61d54efbc5e92cc7ff2126facf5cf2dfcc320d71b162ae"} Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.243254 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"854288a3-bb59-4721-b1ac-059920cd8c30","Type":"ContainerStarted","Data":"ae78c33912214d314b0334944bad0db447782ef2635f2d13bd5102f6e09acc88"} Oct 01 13:23:24 crc kubenswrapper[4749]: E1001 13:23:24.276198 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-ovn-base:watcher_latest\\\"\"" pod="openstack/ovn-controller-ovs-75t94" podUID="45d86d96-b332-498e-a952-c34007c2f07b" Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.351662 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.423627 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czbb4\" (UniqueName: \"kubernetes.io/projected/4e05980f-053b-45c7-8421-4f025831fc1c-kube-api-access-czbb4\") pod \"4e05980f-053b-45c7-8421-4f025831fc1c\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.423663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-dns-svc\") pod \"4e05980f-053b-45c7-8421-4f025831fc1c\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.423802 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-config\") pod \"4e05980f-053b-45c7-8421-4f025831fc1c\" (UID: \"4e05980f-053b-45c7-8421-4f025831fc1c\") " Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.429282 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e05980f-053b-45c7-8421-4f025831fc1c-kube-api-access-czbb4" (OuterVolumeSpecName: "kube-api-access-czbb4") pod "4e05980f-053b-45c7-8421-4f025831fc1c" (UID: "4e05980f-053b-45c7-8421-4f025831fc1c"). InnerVolumeSpecName "kube-api-access-czbb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.445135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-config" (OuterVolumeSpecName: "config") pod "4e05980f-053b-45c7-8421-4f025831fc1c" (UID: "4e05980f-053b-45c7-8421-4f025831fc1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.450037 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e05980f-053b-45c7-8421-4f025831fc1c" (UID: "4e05980f-053b-45c7-8421-4f025831fc1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.526587 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czbb4\" (UniqueName: \"kubernetes.io/projected/4e05980f-053b-45c7-8421-4f025831fc1c-kube-api-access-czbb4\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.526621 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:24 crc kubenswrapper[4749]: I1001 13:23:24.526632 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e05980f-053b-45c7-8421-4f025831fc1c-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:25 crc kubenswrapper[4749]: I1001 13:23:25.251117 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" Oct 01 13:23:25 crc kubenswrapper[4749]: I1001 13:23:25.251092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-547bf6db69-wzvwk" event={"ID":"4e05980f-053b-45c7-8421-4f025831fc1c","Type":"ContainerDied","Data":"471ed49a05166d55dbf52f3b39df134dc59372a684588edeae24292989bcd5c1"} Oct 01 13:23:25 crc kubenswrapper[4749]: I1001 13:23:25.251203 4749 scope.go:117] "RemoveContainer" containerID="c9c82056938f79ac38e6e8e078c89ff84679dc126e3fade3a2426a7c0f2d5832" Oct 01 13:23:25 crc kubenswrapper[4749]: I1001 13:23:25.295012 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-547bf6db69-wzvwk"] Oct 01 13:23:25 crc kubenswrapper[4749]: I1001 13:23:25.299458 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-547bf6db69-wzvwk"] Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.341278 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-jfv7g"] Oct 01 13:23:26 crc kubenswrapper[4749]: E1001 13:23:26.342007 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e05980f-053b-45c7-8421-4f025831fc1c" containerName="init" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.342022 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e05980f-053b-45c7-8421-4f025831fc1c" containerName="init" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.342204 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e05980f-053b-45c7-8421-4f025831fc1c" containerName="init" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.342863 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.346451 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.346494 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jfv7g"] Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.462795 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-ovn-rundir\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.462844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.462891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-config\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.462935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-combined-ca-bundle\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " 
pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.462950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-ovs-rundir\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.462973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjstx\" (UniqueName: \"kubernetes.io/projected/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-kube-api-access-cjstx\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.480467 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-669746885c-bsc46"] Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.508679 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84c56b8c69-kwxfr"] Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.510871 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.513309 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.522005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c56b8c69-kwxfr"] Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.564339 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.564411 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-config\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.564459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-combined-ca-bundle\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.564475 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-ovs-rundir\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 
13:23:26.564951 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjstx\" (UniqueName: \"kubernetes.io/projected/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-kube-api-access-cjstx\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.565606 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-config\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.569314 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.571901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-ovn-rundir\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.572207 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-ovs-rundir\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.572308 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-ovn-rundir\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.575576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-combined-ca-bundle\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.590922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjstx\" (UniqueName: \"kubernetes.io/projected/2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca-kube-api-access-cjstx\") pod \"ovn-controller-metrics-jfv7g\" (UID: \"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca\") " pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.593123 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-rrs9k"] Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.628093 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5446c9d685-t66rg"] Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.631699 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.635157 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5446c9d685-t66rg"] Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.635335 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.672922 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwn5n\" (UniqueName: \"kubernetes.io/projected/130fc93c-20b8-484e-8ae8-dd1da0398609-kube-api-access-dwn5n\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.673261 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-config\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.673288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-ovsdbserver-nb\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.673308 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-dns-svc\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " 
pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.690041 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-jfv7g" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.774915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-sb\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.774982 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwn5n\" (UniqueName: \"kubernetes.io/projected/130fc93c-20b8-484e-8ae8-dd1da0398609-kube-api-access-dwn5n\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.775044 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-config\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.775080 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-ovsdbserver-nb\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.775101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-dns-svc\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.775138 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxkq\" (UniqueName: \"kubernetes.io/projected/6cd3a926-2d20-442c-8c9f-8b1641848284-kube-api-access-vlxkq\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.775164 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-config\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.775202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-nb\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.775240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-dns-svc\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.776099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-ovsdbserver-nb\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.776814 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-dns-svc\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.780641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-config\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.793823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwn5n\" (UniqueName: \"kubernetes.io/projected/130fc93c-20b8-484e-8ae8-dd1da0398609-kube-api-access-dwn5n\") pod \"dnsmasq-dns-84c56b8c69-kwxfr\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.829946 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.876848 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-sb\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.877864 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-sb\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.878083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxkq\" (UniqueName: \"kubernetes.io/projected/6cd3a926-2d20-442c-8c9f-8b1641848284-kube-api-access-vlxkq\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.878437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-config\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.879022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-config\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 
crc kubenswrapper[4749]: I1001 13:23:26.880014 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-nb\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.880038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-dns-svc\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.880758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-dns-svc\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.881124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-nb\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.899887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlxkq\" (UniqueName: \"kubernetes.io/projected/6cd3a926-2d20-442c-8c9f-8b1641848284-kube-api-access-vlxkq\") pod \"dnsmasq-dns-5446c9d685-t66rg\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") " pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:26 crc kubenswrapper[4749]: I1001 13:23:26.960272 4749 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:27 crc kubenswrapper[4749]: I1001 13:23:27.241616 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e05980f-053b-45c7-8421-4f025831fc1c" path="/var/lib/kubelet/pods/4e05980f-053b-45c7-8421-4f025831fc1c/volumes" Oct 01 13:23:30 crc kubenswrapper[4749]: I1001 13:23:30.606076 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-jfv7g"] Oct 01 13:23:30 crc kubenswrapper[4749]: I1001 13:23:30.671947 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c56b8c69-kwxfr"] Oct 01 13:23:30 crc kubenswrapper[4749]: W1001 13:23:30.827271 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod130fc93c_20b8_484e_8ae8_dd1da0398609.slice/crio-a5336f55279e0915d29849394bf4fa87278becbc8e9ecb7adc28b7bc8b88d2b1 WatchSource:0}: Error finding container a5336f55279e0915d29849394bf4fa87278becbc8e9ecb7adc28b7bc8b88d2b1: Status 404 returned error can't find the container with id a5336f55279e0915d29849394bf4fa87278becbc8e9ecb7adc28b7bc8b88d2b1 Oct 01 13:23:30 crc kubenswrapper[4749]: I1001 13:23:30.930271 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5446c9d685-t66rg"] Oct 01 13:23:31 crc kubenswrapper[4749]: W1001 13:23:31.039585 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cd3a926_2d20_442c_8c9f_8b1641848284.slice/crio-95942a3f75d2e4964e3d3e7aa42bd92ce1f6356059ec7d66b6756e9aaf1dd882 WatchSource:0}: Error finding container 95942a3f75d2e4964e3d3e7aa42bd92ce1f6356059ec7d66b6756e9aaf1dd882: Status 404 returned error can't find the container with id 95942a3f75d2e4964e3d3e7aa42bd92ce1f6356059ec7d66b6756e9aaf1dd882 Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.316266 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" event={"ID":"130fc93c-20b8-484e-8ae8-dd1da0398609","Type":"ContainerStarted","Data":"a5336f55279e0915d29849394bf4fa87278becbc8e9ecb7adc28b7bc8b88d2b1"} Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.318557 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jfv7g" event={"ID":"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca","Type":"ContainerStarted","Data":"af78ce17edcde0db9efc06cde1dd4693daa5befc4031af3a71c17c99d95e7826"} Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.320349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" event={"ID":"6cd3a926-2d20-442c-8c9f-8b1641848284","Type":"ContainerStarted","Data":"95942a3f75d2e4964e3d3e7aa42bd92ce1f6356059ec7d66b6756e9aaf1dd882"} Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.334873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669746885c-bsc46" event={"ID":"42ec794e-8dae-4b0a-a77c-1f2f56047842","Type":"ContainerStarted","Data":"6c00cf412281cc45bf021a2f810a9903fa9c24c3082e7b80bd5e1ecaeb7c83f4"} Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.335205 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-669746885c-bsc46" podUID="42ec794e-8dae-4b0a-a77c-1f2f56047842" containerName="dnsmasq-dns" containerID="cri-o://6c00cf412281cc45bf021a2f810a9903fa9c24c3082e7b80bd5e1ecaeb7c83f4" gracePeriod=10 Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.335290 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.340927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09","Type":"ContainerStarted","Data":"b573fefa5bf702b7364fe7025a6d1575d49eef3634722b3e91dee8b562914c12"} 
Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.343171 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" event={"ID":"a35c1692-bf33-4303-9783-4b734e8d5aa4","Type":"ContainerStarted","Data":"0e27f441f31cd49182d6c66b14303c84a1d8e9660a4a6a534166133c8e394604"} Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.343257 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.343256 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" podUID="a35c1692-bf33-4303-9783-4b734e8d5aa4" containerName="dnsmasq-dns" containerID="cri-o://0e27f441f31cd49182d6c66b14303c84a1d8e9660a4a6a534166133c8e394604" gracePeriod=10 Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.532744 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" podStartSLOduration=23.231659069 podStartE2EDuration="23.532727937s" podCreationTimestamp="2025-10-01 13:23:08 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.110303498 +0000 UTC m=+1062.164288397" lastFinishedPulling="2025-10-01 13:23:22.411372366 +0000 UTC m=+1062.465357265" observedRunningTime="2025-10-01 13:23:31.53210237 +0000 UTC m=+1071.586087279" watchObservedRunningTime="2025-10-01 13:23:31.532727937 +0000 UTC m=+1071.586712836" Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.556473 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.301635072 podStartE2EDuration="18.556453495s" podCreationTimestamp="2025-10-01 13:23:13 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.081669565 +0000 UTC m=+1062.135654464" lastFinishedPulling="2025-10-01 13:23:29.336487988 +0000 UTC m=+1069.390472887" observedRunningTime="2025-10-01 13:23:31.547628324 +0000 UTC 
m=+1071.601613233" watchObservedRunningTime="2025-10-01 13:23:31.556453495 +0000 UTC m=+1071.610438394" Oct 01 13:23:31 crc kubenswrapper[4749]: I1001 13:23:31.571277 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-669746885c-bsc46" podStartSLOduration=23.260458877 podStartE2EDuration="23.57125778s" podCreationTimestamp="2025-10-01 13:23:08 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.102356711 +0000 UTC m=+1062.156341610" lastFinishedPulling="2025-10-01 13:23:22.413155614 +0000 UTC m=+1062.467140513" observedRunningTime="2025-10-01 13:23:31.565322508 +0000 UTC m=+1071.619307427" watchObservedRunningTime="2025-10-01 13:23:31.57125778 +0000 UTC m=+1071.625242679" Oct 01 13:23:32 crc kubenswrapper[4749]: I1001 13:23:32.106105 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:23:32 crc kubenswrapper[4749]: I1001 13:23:32.106174 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:23:32 crc kubenswrapper[4749]: I1001 13:23:32.351403 4749 generic.go:334] "Generic (PLEG): container finished" podID="42ec794e-8dae-4b0a-a77c-1f2f56047842" containerID="6c00cf412281cc45bf021a2f810a9903fa9c24c3082e7b80bd5e1ecaeb7c83f4" exitCode=0 Oct 01 13:23:32 crc kubenswrapper[4749]: I1001 13:23:32.351456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669746885c-bsc46" 
event={"ID":"42ec794e-8dae-4b0a-a77c-1f2f56047842","Type":"ContainerDied","Data":"6c00cf412281cc45bf021a2f810a9903fa9c24c3082e7b80bd5e1ecaeb7c83f4"} Oct 01 13:23:32 crc kubenswrapper[4749]: I1001 13:23:32.353508 4749 generic.go:334] "Generic (PLEG): container finished" podID="a35c1692-bf33-4303-9783-4b734e8d5aa4" containerID="0e27f441f31cd49182d6c66b14303c84a1d8e9660a4a6a534166133c8e394604" exitCode=0 Oct 01 13:23:32 crc kubenswrapper[4749]: I1001 13:23:32.353596 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" event={"ID":"a35c1692-bf33-4303-9783-4b734e8d5aa4","Type":"ContainerDied","Data":"0e27f441f31cd49182d6c66b14303c84a1d8e9660a4a6a534166133c8e394604"} Oct 01 13:23:32 crc kubenswrapper[4749]: I1001 13:23:32.353712 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.005582 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.121938 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-dns-svc\") pod \"42ec794e-8dae-4b0a-a77c-1f2f56047842\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.122043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjg6g\" (UniqueName: \"kubernetes.io/projected/42ec794e-8dae-4b0a-a77c-1f2f56047842-kube-api-access-qjg6g\") pod \"42ec794e-8dae-4b0a-a77c-1f2f56047842\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.122129 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-config\") pod \"42ec794e-8dae-4b0a-a77c-1f2f56047842\" (UID: \"42ec794e-8dae-4b0a-a77c-1f2f56047842\") " Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.235187 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ec794e-8dae-4b0a-a77c-1f2f56047842-kube-api-access-qjg6g" (OuterVolumeSpecName: "kube-api-access-qjg6g") pod "42ec794e-8dae-4b0a-a77c-1f2f56047842" (UID: "42ec794e-8dae-4b0a-a77c-1f2f56047842"). InnerVolumeSpecName "kube-api-access-qjg6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.256621 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-config" (OuterVolumeSpecName: "config") pod "42ec794e-8dae-4b0a-a77c-1f2f56047842" (UID: "42ec794e-8dae-4b0a-a77c-1f2f56047842"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.274350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42ec794e-8dae-4b0a-a77c-1f2f56047842" (UID: "42ec794e-8dae-4b0a-a77c-1f2f56047842"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.326303 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.326330 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjg6g\" (UniqueName: \"kubernetes.io/projected/42ec794e-8dae-4b0a-a77c-1f2f56047842-kube-api-access-qjg6g\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.326343 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42ec794e-8dae-4b0a-a77c-1f2f56047842-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.342286 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.379208 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-669746885c-bsc46" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.379836 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-669746885c-bsc46" event={"ID":"42ec794e-8dae-4b0a-a77c-1f2f56047842","Type":"ContainerDied","Data":"522abbbe0783fc64b98dcab9396d6e6a5eb2316d58b8eb6652c2848c67cc75ba"} Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.379967 4749 scope.go:117] "RemoveContainer" containerID="6c00cf412281cc45bf021a2f810a9903fa9c24c3082e7b80bd5e1ecaeb7c83f4" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.395670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89621c5c-1d46-44be-852f-1a37dccf02e9","Type":"ContainerStarted","Data":"6432798b0d8b723cf7c554daa2ac158cb3aa44a8340879d92c784631824689c2"} Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.401789 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.402233 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-rrs9k" event={"ID":"a35c1692-bf33-4303-9783-4b734e8d5aa4","Type":"ContainerDied","Data":"ac4cf7933926f019c1d0d4454970086fe1c984ea50fa213103650e7956309b27"} Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.496621 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-669746885c-bsc46"] Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.502285 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-669746885c-bsc46"] Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.531936 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-dns-svc\") pod \"a35c1692-bf33-4303-9783-4b734e8d5aa4\" (UID: 
\"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.532064 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-config\") pod \"a35c1692-bf33-4303-9783-4b734e8d5aa4\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.532147 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwlcn\" (UniqueName: \"kubernetes.io/projected/a35c1692-bf33-4303-9783-4b734e8d5aa4-kube-api-access-nwlcn\") pod \"a35c1692-bf33-4303-9783-4b734e8d5aa4\" (UID: \"a35c1692-bf33-4303-9783-4b734e8d5aa4\") " Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.543653 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a35c1692-bf33-4303-9783-4b734e8d5aa4-kube-api-access-nwlcn" (OuterVolumeSpecName: "kube-api-access-nwlcn") pod "a35c1692-bf33-4303-9783-4b734e8d5aa4" (UID: "a35c1692-bf33-4303-9783-4b734e8d5aa4"). InnerVolumeSpecName "kube-api-access-nwlcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.633918 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwlcn\" (UniqueName: \"kubernetes.io/projected/a35c1692-bf33-4303-9783-4b734e8d5aa4-kube-api-access-nwlcn\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.712891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a35c1692-bf33-4303-9783-4b734e8d5aa4" (UID: "a35c1692-bf33-4303-9783-4b734e8d5aa4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.717300 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-config" (OuterVolumeSpecName: "config") pod "a35c1692-bf33-4303-9783-4b734e8d5aa4" (UID: "a35c1692-bf33-4303-9783-4b734e8d5aa4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.736383 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:33 crc kubenswrapper[4749]: I1001 13:23:33.736435 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35c1692-bf33-4303-9783-4b734e8d5aa4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.031459 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-rrs9k"] Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.040390 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-rrs9k"] Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.412158 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sl4xv" event={"ID":"4035d0d3-eeec-429f-b31e-ab4649ecf92a","Type":"ContainerStarted","Data":"7b2fff159725370ff7a93d9163a8edf531cad0ecf2122dde8f77b5a12f07a86a"} Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.413828 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b75b73e1-aac4-41c8-9ad7-afe216cf9741","Type":"ContainerStarted","Data":"3bbd737a66fbee0146ce03b6268bbfd82c35f956a8bbabe033fa7936eed6bae9"} Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.414926 4749 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-sl4xv" Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.420897 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerStarted","Data":"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed"} Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.423777 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5","Type":"ContainerStarted","Data":"4a7bc0858ba12b29d8020ca1d26b825249e08cc9451a67d521cf104d0a98c633"} Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.428435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"215f6c11-74c0-4e5e-a39d-8af23dd5e4af","Type":"ContainerStarted","Data":"599a42c65d8ed297e5a770521c19a580ce8e034157cc6acf902544ac51f7d67f"} Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.431211 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"854288a3-bb59-4721-b1ac-059920cd8c30","Type":"ContainerStarted","Data":"3cc7fc6312814283fd3ccbd3f4c30b7bb463383245ef2c351fb378e3815fbd45"} Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.432962 4749 generic.go:334] "Generic (PLEG): container finished" podID="130fc93c-20b8-484e-8ae8-dd1da0398609" containerID="072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19" exitCode=0 Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.433018 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" event={"ID":"130fc93c-20b8-484e-8ae8-dd1da0398609","Type":"ContainerDied","Data":"072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19"} Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.434851 4749 generic.go:334] "Generic (PLEG): 
container finished" podID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerID="a4fbc9f6e3df906bf71449cb2a2989385add840041052eb23c1441ee25ad97dd" exitCode=0 Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.434898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" event={"ID":"6cd3a926-2d20-442c-8c9f-8b1641848284","Type":"ContainerDied","Data":"a4fbc9f6e3df906bf71449cb2a2989385add840041052eb23c1441ee25ad97dd"} Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.441592 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sl4xv" podStartSLOduration=7.691148024 podStartE2EDuration="15.44156371s" podCreationTimestamp="2025-10-01 13:23:19 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.400045006 +0000 UTC m=+1062.454029905" lastFinishedPulling="2025-10-01 13:23:30.150460702 +0000 UTC m=+1070.204445591" observedRunningTime="2025-10-01 13:23:34.427976689 +0000 UTC m=+1074.481961588" watchObservedRunningTime="2025-10-01 13:23:34.44156371 +0000 UTC m=+1074.495548619" Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.729583 4749 scope.go:117] "RemoveContainer" containerID="28bee992374b3b331e90cd75e46afc645faa92ccf51e145366d58545f00f1fdd" Oct 01 13:23:34 crc kubenswrapper[4749]: I1001 13:23:34.962400 4749 scope.go:117] "RemoveContainer" containerID="0e27f441f31cd49182d6c66b14303c84a1d8e9660a4a6a534166133c8e394604" Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.053533 4749 scope.go:117] "RemoveContainer" containerID="9e169d4dab141d47abeb3d0ffd7edff6b8a1e9624bd23de4d4e1617ca3ce7535" Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.246311 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ec794e-8dae-4b0a-a77c-1f2f56047842" path="/var/lib/kubelet/pods/42ec794e-8dae-4b0a-a77c-1f2f56047842/volumes" Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.247021 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a35c1692-bf33-4303-9783-4b734e8d5aa4" path="/var/lib/kubelet/pods/a35c1692-bf33-4303-9783-4b734e8d5aa4/volumes" Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.443570 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35e1759f-e27a-4891-9fc0-37753b25689d","Type":"ContainerStarted","Data":"be178b2e3c636505e7559b1f4f4b2b933f257fb8e1218e28feced6d711498217"} Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.448272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" event={"ID":"130fc93c-20b8-484e-8ae8-dd1da0398609","Type":"ContainerStarted","Data":"c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1"} Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.451731 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2054662-5786-4a04-a7c9-16fe32a04610","Type":"ContainerStarted","Data":"2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090"} Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.453127 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1d655140-d63d-4e40-8de1-875213f37d4a","Type":"ContainerStarted","Data":"3da935e241d19c118277a2ebfd988ad64a4c330d0fc5f8743768b13613da5ec6"} Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.455799 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"854288a3-bb59-4721-b1ac-059920cd8c30","Type":"ContainerStarted","Data":"781c6de46e954a5593f24a5074e0e0a4920c78d38c7a21a65b950a40a36fed4d"} Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.465467 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" event={"ID":"6cd3a926-2d20-442c-8c9f-8b1641848284","Type":"ContainerStarted","Data":"f03b3ee0b148ce078a7b2628173db46df1b8829f714e79cbf5a2fcdc63e4c2f1"} Oct 01 
13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.466306 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:35 crc kubenswrapper[4749]: I1001 13:23:35.508554 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" podStartSLOduration=9.508531999 podStartE2EDuration="9.508531999s" podCreationTimestamp="2025-10-01 13:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:23:35.50380502 +0000 UTC m=+1075.557789919" watchObservedRunningTime="2025-10-01 13:23:35.508531999 +0000 UTC m=+1075.562516898" Oct 01 13:23:36 crc kubenswrapper[4749]: I1001 13:23:36.481792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-jfv7g" event={"ID":"2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca","Type":"ContainerStarted","Data":"544c3bce9a40d9833c13752e916e2d64e125cf318113d65b0063d6b8ac7fbea1"} Oct 01 13:23:36 crc kubenswrapper[4749]: I1001 13:23:36.486608 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eb007c1f-6b53-4c8a-9921-85ccd3d5dad5","Type":"ContainerStarted","Data":"941ca3f8beb33fcc1bfbfbc0359741dd19d691ed235d9fda9c3a4972ae94b1a2"} Oct 01 13:23:36 crc kubenswrapper[4749]: I1001 13:23:36.512524 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.200360836 podStartE2EDuration="22.512494636s" podCreationTimestamp="2025-10-01 13:23:14 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.192461263 +0000 UTC m=+1062.246446162" lastFinishedPulling="2025-10-01 13:23:34.504595063 +0000 UTC m=+1074.558579962" observedRunningTime="2025-10-01 13:23:36.509777662 +0000 UTC m=+1076.563762601" watchObservedRunningTime="2025-10-01 13:23:36.512494636 +0000 UTC m=+1076.566479575" Oct 01 13:23:36 crc 
kubenswrapper[4749]: I1001 13:23:36.545278 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.663654966 podStartE2EDuration="15.545252591s" podCreationTimestamp="2025-10-01 13:23:21 +0000 UTC" firstStartedPulling="2025-10-01 13:23:23.207319037 +0000 UTC m=+1063.261303936" lastFinishedPulling="2025-10-01 13:23:35.088916662 +0000 UTC m=+1075.142901561" observedRunningTime="2025-10-01 13:23:36.53678251 +0000 UTC m=+1076.590767449" watchObservedRunningTime="2025-10-01 13:23:36.545252591 +0000 UTC m=+1076.599237530" Oct 01 13:23:36 crc kubenswrapper[4749]: I1001 13:23:36.570661 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" podStartSLOduration=10.570635585 podStartE2EDuration="10.570635585s" podCreationTimestamp="2025-10-01 13:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:23:36.565949567 +0000 UTC m=+1076.619934506" watchObservedRunningTime="2025-10-01 13:23:36.570635585 +0000 UTC m=+1076.624620524" Oct 01 13:23:36 crc kubenswrapper[4749]: I1001 13:23:36.830510 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:37 crc kubenswrapper[4749]: I1001 13:23:37.529307 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-jfv7g" podStartSLOduration=7.098375753 podStartE2EDuration="11.529285593s" podCreationTimestamp="2025-10-01 13:23:26 +0000 UTC" firstStartedPulling="2025-10-01 13:23:30.654031613 +0000 UTC m=+1070.708016502" lastFinishedPulling="2025-10-01 13:23:35.084941443 +0000 UTC m=+1075.138926342" observedRunningTime="2025-10-01 13:23:37.513036619 +0000 UTC m=+1077.567021548" watchObservedRunningTime="2025-10-01 13:23:37.529285593 +0000 UTC m=+1077.583270502" Oct 01 13:23:37 crc 
kubenswrapper[4749]: I1001 13:23:37.598582 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:37 crc kubenswrapper[4749]: I1001 13:23:37.598633 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:37 crc kubenswrapper[4749]: I1001 13:23:37.648110 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:38 crc kubenswrapper[4749]: I1001 13:23:38.397963 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 01 13:23:38 crc kubenswrapper[4749]: I1001 13:23:38.570016 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 01 13:23:41 crc kubenswrapper[4749]: I1001 13:23:41.578162 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=10.899810025 podStartE2EDuration="23.578126752s" podCreationTimestamp="2025-10-01 13:23:18 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.496977155 +0000 UTC m=+1062.550962054" lastFinishedPulling="2025-10-01 13:23:35.175293882 +0000 UTC m=+1075.229278781" observedRunningTime="2025-10-01 13:23:41.56781441 +0000 UTC m=+1081.621799329" watchObservedRunningTime="2025-10-01 13:23:41.578126752 +0000 UTC m=+1081.632111711" Oct 01 13:23:41 crc kubenswrapper[4749]: I1001 13:23:41.832994 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:41 crc kubenswrapper[4749]: I1001 13:23:41.961422 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" Oct 01 13:23:42 crc kubenswrapper[4749]: I1001 13:23:42.013499 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c56b8c69-kwxfr"] Oct 01 13:23:42 crc 
kubenswrapper[4749]: I1001 13:23:42.549435 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-75t94" event={"ID":"45d86d96-b332-498e-a952-c34007c2f07b","Type":"ContainerStarted","Data":"d803d6aafb396355d3b2513a55c7728d0c275b6f6c0bb123db6c706ca4225a87"} Oct 01 13:23:42 crc kubenswrapper[4749]: I1001 13:23:42.552395 4749 generic.go:334] "Generic (PLEG): container finished" podID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerID="b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed" exitCode=0 Oct 01 13:23:42 crc kubenswrapper[4749]: I1001 13:23:42.552526 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerDied","Data":"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed"} Oct 01 13:23:42 crc kubenswrapper[4749]: I1001 13:23:42.552639 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" podUID="130fc93c-20b8-484e-8ae8-dd1da0398609" containerName="dnsmasq-dns" containerID="cri-o://c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1" gracePeriod=10 Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.099366 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.225651 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwn5n\" (UniqueName: \"kubernetes.io/projected/130fc93c-20b8-484e-8ae8-dd1da0398609-kube-api-access-dwn5n\") pod \"130fc93c-20b8-484e-8ae8-dd1da0398609\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.225714 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-dns-svc\") pod \"130fc93c-20b8-484e-8ae8-dd1da0398609\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.225767 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-ovsdbserver-nb\") pod \"130fc93c-20b8-484e-8ae8-dd1da0398609\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.225822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-config\") pod \"130fc93c-20b8-484e-8ae8-dd1da0398609\" (UID: \"130fc93c-20b8-484e-8ae8-dd1da0398609\") " Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.232418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/130fc93c-20b8-484e-8ae8-dd1da0398609-kube-api-access-dwn5n" (OuterVolumeSpecName: "kube-api-access-dwn5n") pod "130fc93c-20b8-484e-8ae8-dd1da0398609" (UID: "130fc93c-20b8-484e-8ae8-dd1da0398609"). InnerVolumeSpecName "kube-api-access-dwn5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.265597 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-config" (OuterVolumeSpecName: "config") pod "130fc93c-20b8-484e-8ae8-dd1da0398609" (UID: "130fc93c-20b8-484e-8ae8-dd1da0398609"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.274732 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "130fc93c-20b8-484e-8ae8-dd1da0398609" (UID: "130fc93c-20b8-484e-8ae8-dd1da0398609"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.277637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "130fc93c-20b8-484e-8ae8-dd1da0398609" (UID: "130fc93c-20b8-484e-8ae8-dd1da0398609"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.328167 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwn5n\" (UniqueName: \"kubernetes.io/projected/130fc93c-20b8-484e-8ae8-dd1da0398609-kube-api-access-dwn5n\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.328201 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.328232 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.328245 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/130fc93c-20b8-484e-8ae8-dd1da0398609-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.564164 4749 generic.go:334] "Generic (PLEG): container finished" podID="130fc93c-20b8-484e-8ae8-dd1da0398609" containerID="c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1" exitCode=0 Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.564280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" event={"ID":"130fc93c-20b8-484e-8ae8-dd1da0398609","Type":"ContainerDied","Data":"c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1"} Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.564371 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" event={"ID":"130fc93c-20b8-484e-8ae8-dd1da0398609","Type":"ContainerDied","Data":"a5336f55279e0915d29849394bf4fa87278becbc8e9ecb7adc28b7bc8b88d2b1"} Oct 01 
13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.564292 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c56b8c69-kwxfr" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.564416 4749 scope.go:117] "RemoveContainer" containerID="c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.566883 4749 generic.go:334] "Generic (PLEG): container finished" podID="45d86d96-b332-498e-a952-c34007c2f07b" containerID="d803d6aafb396355d3b2513a55c7728d0c275b6f6c0bb123db6c706ca4225a87" exitCode=0 Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.566927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-75t94" event={"ID":"45d86d96-b332-498e-a952-c34007c2f07b","Type":"ContainerDied","Data":"d803d6aafb396355d3b2513a55c7728d0c275b6f6c0bb123db6c706ca4225a87"} Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.626100 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c56b8c69-kwxfr"] Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.633398 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.641945 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84c56b8c69-kwxfr"] Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.649376 4749 scope.go:117] "RemoveContainer" containerID="072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.688068 4749 scope.go:117] "RemoveContainer" containerID="c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1" Oct 01 13:23:43 crc kubenswrapper[4749]: E1001 13:23:43.688990 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1\": container with ID starting with c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1 not found: ID does not exist" containerID="c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.689045 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1"} err="failed to get container status \"c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1\": rpc error: code = NotFound desc = could not find container \"c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1\": container with ID starting with c6859a1924ad41ae03dd876fde7ac46580dd053417c96ecf2ef3c316e62315d1 not found: ID does not exist" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.689076 4749 scope.go:117] "RemoveContainer" containerID="072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19" Oct 01 13:23:43 crc kubenswrapper[4749]: E1001 13:23:43.689480 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19\": container with ID starting with 072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19 not found: ID does not exist" containerID="072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.689518 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19"} err="failed to get container status \"072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19\": rpc error: code = NotFound desc = could not find container \"072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19\": container with ID 
starting with 072d6e6644f5ff84e20533a3d4aa31a709a4898ca4f9ad30e816ed7b48038e19 not found: ID does not exist" Oct 01 13:23:43 crc kubenswrapper[4749]: I1001 13:23:43.706955 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.582485 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-75t94" event={"ID":"45d86d96-b332-498e-a952-c34007c2f07b","Type":"ContainerStarted","Data":"ab39b15e3b3aa687b6cf4a40c9d831312a7940393c51bf6f26d7f8c29d01c9c3"} Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.582942 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.582963 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-75t94" event={"ID":"45d86d96-b332-498e-a952-c34007c2f07b","Type":"ContainerStarted","Data":"8c18a99951fe33646c7ac3d458e08d07133a6d163c47642b4f8e3e8392551832"} Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.583233 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.583303 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.616489 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-75t94" podStartSLOduration=6.177282614 podStartE2EDuration="25.616469895s" podCreationTimestamp="2025-10-01 13:23:19 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.570191366 +0000 UTC m=+1062.624176265" lastFinishedPulling="2025-10-01 13:23:42.009378647 +0000 UTC m=+1082.063363546" observedRunningTime="2025-10-01 13:23:44.612921198 +0000 UTC m=+1084.666906117" watchObservedRunningTime="2025-10-01 
13:23:44.616469895 +0000 UTC m=+1084.670454804" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.646826 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.789938 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 01 13:23:44 crc kubenswrapper[4749]: E1001 13:23:44.790352 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130fc93c-20b8-484e-8ae8-dd1da0398609" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790372 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="130fc93c-20b8-484e-8ae8-dd1da0398609" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: E1001 13:23:44.790398 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ec794e-8dae-4b0a-a77c-1f2f56047842" containerName="init" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790406 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ec794e-8dae-4b0a-a77c-1f2f56047842" containerName="init" Oct 01 13:23:44 crc kubenswrapper[4749]: E1001 13:23:44.790416 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ec794e-8dae-4b0a-a77c-1f2f56047842" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790425 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ec794e-8dae-4b0a-a77c-1f2f56047842" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: E1001 13:23:44.790447 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35c1692-bf33-4303-9783-4b734e8d5aa4" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790454 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35c1692-bf33-4303-9783-4b734e8d5aa4" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: E1001 13:23:44.790469 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a35c1692-bf33-4303-9783-4b734e8d5aa4" containerName="init" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790477 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35c1692-bf33-4303-9783-4b734e8d5aa4" containerName="init" Oct 01 13:23:44 crc kubenswrapper[4749]: E1001 13:23:44.790491 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="130fc93c-20b8-484e-8ae8-dd1da0398609" containerName="init" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790499 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="130fc93c-20b8-484e-8ae8-dd1da0398609" containerName="init" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790687 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="130fc93c-20b8-484e-8ae8-dd1da0398609" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790724 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ec794e-8dae-4b0a-a77c-1f2f56047842" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.790754 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a35c1692-bf33-4303-9783-4b734e8d5aa4" containerName="dnsmasq-dns" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.791766 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.838891 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.838999 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.839154 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-wtqt7" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.839242 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.846717 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.960464 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.960603 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.960713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78021fea-966d-45d2-8816-265437360e8f-scripts\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " 
pod="openstack/ovn-northd-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.960773 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzfrm\" (UniqueName: \"kubernetes.io/projected/78021fea-966d-45d2-8816-265437360e8f-kube-api-access-lzfrm\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.960823 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.960871 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78021fea-966d-45d2-8816-265437360e8f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:44 crc kubenswrapper[4749]: I1001 13:23:44.960935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78021fea-966d-45d2-8816-265437360e8f-config\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.067459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.067562 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78021fea-966d-45d2-8816-265437360e8f-scripts\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.067615 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzfrm\" (UniqueName: \"kubernetes.io/projected/78021fea-966d-45d2-8816-265437360e8f-kube-api-access-lzfrm\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.067651 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.067707 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78021fea-966d-45d2-8816-265437360e8f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.067774 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78021fea-966d-45d2-8816-265437360e8f-config\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.067858 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.069184 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78021fea-966d-45d2-8816-265437360e8f-scripts\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.069327 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78021fea-966d-45d2-8816-265437360e8f-config\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.069610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/78021fea-966d-45d2-8816-265437360e8f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.079986 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.080092 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.095130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/78021fea-966d-45d2-8816-265437360e8f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.100104 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzfrm\" (UniqueName: \"kubernetes.io/projected/78021fea-966d-45d2-8816-265437360e8f-kube-api-access-lzfrm\") pod \"ovn-northd-0\" (UID: \"78021fea-966d-45d2-8816-265437360e8f\") " pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.108794 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666cf554b5-49snv"] Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.115714 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.118937 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666cf554b5-49snv"] Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.158393 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.208308 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.214603 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.242568 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="130fc93c-20b8-484e-8ae8-dd1da0398609" path="/var/lib/kubelet/pods/130fc93c-20b8-484e-8ae8-dd1da0398609/volumes" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.272628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-config\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.272665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-sb\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.272696 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c686n\" (UniqueName: \"kubernetes.io/projected/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-kube-api-access-c686n\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.272725 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-dns-svc\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.272753 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-nb\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.375138 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-dns-svc\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.375197 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-nb\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.375409 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-config\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.375436 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-sb\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.375467 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c686n\" (UniqueName: \"kubernetes.io/projected/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-kube-api-access-c686n\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.376452 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-nb\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.376983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-config\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.377282 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-dns-svc\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.377380 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-sb\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.400262 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c686n\" (UniqueName: \"kubernetes.io/projected/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-kube-api-access-c686n\") pod \"dnsmasq-dns-666cf554b5-49snv\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.461180 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.617141 4749 generic.go:334] "Generic (PLEG): container finished" podID="b75b73e1-aac4-41c8-9ad7-afe216cf9741" containerID="3bbd737a66fbee0146ce03b6268bbfd82c35f956a8bbabe033fa7936eed6bae9" exitCode=0 Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.617339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b75b73e1-aac4-41c8-9ad7-afe216cf9741","Type":"ContainerDied","Data":"3bbd737a66fbee0146ce03b6268bbfd82c35f956a8bbabe033fa7936eed6bae9"} Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.625641 4749 generic.go:334] "Generic (PLEG): container finished" podID="215f6c11-74c0-4e5e-a39d-8af23dd5e4af" containerID="599a42c65d8ed297e5a770521c19a580ce8e034157cc6acf902544ac51f7d67f" exitCode=0 Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.626554 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"215f6c11-74c0-4e5e-a39d-8af23dd5e4af","Type":"ContainerDied","Data":"599a42c65d8ed297e5a770521c19a580ce8e034157cc6acf902544ac51f7d67f"} Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.743124 4749 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 13:23:45 crc kubenswrapper[4749]: I1001 13:23:45.995913 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666cf554b5-49snv"] Oct 01 13:23:46 crc kubenswrapper[4749]: W1001 13:23:46.004248 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb8d09d5_0c47_4f41_b06f_8915f1dd0676.slice/crio-00221eafe79c6e1559b94d85f5ae71d09cd5241ff2901be48eced96abdb516dd WatchSource:0}: Error finding container 00221eafe79c6e1559b94d85f5ae71d09cd5241ff2901be48eced96abdb516dd: Status 404 returned error can't find the container with id 00221eafe79c6e1559b94d85f5ae71d09cd5241ff2901be48eced96abdb516dd Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.245196 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.254337 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.256489 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.256697 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.256857 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-9r99j" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.257035 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.261751 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.425098 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-cache\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.425406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.425438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-lock\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: 
I1001 13:23:46.425456 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.425509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjfwn\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-kube-api-access-rjfwn\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.527230 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.527305 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-lock\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.527332 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.527442 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjfwn\" (UniqueName: 
\"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-kube-api-access-rjfwn\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.527478 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.527746 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-lock\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: E1001 13:23:46.527841 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:23:46 crc kubenswrapper[4749]: E1001 13:23:46.527856 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.527943 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-cache\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: E1001 13:23:46.528017 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift podName:6ef9f56d-2299-424f-9cc3-21cd7fcae8c1 nodeName:}" failed. 
No retries permitted until 2025-10-01 13:23:47.027949363 +0000 UTC m=+1087.081934262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift") pod "swift-storage-0" (UID: "6ef9f56d-2299-424f-9cc3-21cd7fcae8c1") : configmap "swift-ring-files" not found Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.528393 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-cache\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.547010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjfwn\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-kube-api-access-rjfwn\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.550567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.641292 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"215f6c11-74c0-4e5e-a39d-8af23dd5e4af","Type":"ContainerStarted","Data":"9902e3d6788a98a3397228f4c628a94fe383ebb591c285241afd1c270e7599e5"} Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.645432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"78021fea-966d-45d2-8816-265437360e8f","Type":"ContainerStarted","Data":"1196214bbdbeee8e7162cc08a0aa59d56741a3fb4913cb9c331cf7de722aab8a"} Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.647753 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"b75b73e1-aac4-41c8-9ad7-afe216cf9741","Type":"ContainerStarted","Data":"8e160be35708691a85e38c50dd1ae2eb01ea453044cafd509f290bdfa17aa540"} Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.659112 4749 generic.go:334] "Generic (PLEG): container finished" podID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerID="7ccb01cd0f2ac61467ac856afe5e6481da5b3608d438a6bed589faf49ab54e8e" exitCode=0 Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.659352 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666cf554b5-49snv" event={"ID":"cb8d09d5-0c47-4f41-b06f-8915f1dd0676","Type":"ContainerDied","Data":"7ccb01cd0f2ac61467ac856afe5e6481da5b3608d438a6bed589faf49ab54e8e"} Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.670356 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666cf554b5-49snv" event={"ID":"cb8d09d5-0c47-4f41-b06f-8915f1dd0676","Type":"ContainerStarted","Data":"00221eafe79c6e1559b94d85f5ae71d09cd5241ff2901be48eced96abdb516dd"} Oct 01 13:23:46 crc kubenswrapper[4749]: I1001 13:23:46.667057 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=28.620947257 podStartE2EDuration="36.667040174s" podCreationTimestamp="2025-10-01 13:23:10 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.10268897 +0000 UTC m=+1062.156673869" lastFinishedPulling="2025-10-01 13:23:30.148781887 +0000 UTC m=+1070.202766786" observedRunningTime="2025-10-01 13:23:46.66616846 +0000 UTC m=+1086.720153379" watchObservedRunningTime="2025-10-01 13:23:46.667040174 +0000 UTC m=+1086.721025073" Oct 01 13:23:46 crc 
kubenswrapper[4749]: I1001 13:23:46.695675 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=27.265092343 podStartE2EDuration="35.695656696s" podCreationTimestamp="2025-10-01 13:23:11 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.0902606 +0000 UTC m=+1062.144245499" lastFinishedPulling="2025-10-01 13:23:30.520824933 +0000 UTC m=+1070.574809852" observedRunningTime="2025-10-01 13:23:46.686899547 +0000 UTC m=+1086.740884446" watchObservedRunningTime="2025-10-01 13:23:46.695656696 +0000 UTC m=+1086.749641595" Oct 01 13:23:47 crc kubenswrapper[4749]: I1001 13:23:47.044199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:47 crc kubenswrapper[4749]: E1001 13:23:47.044395 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:23:47 crc kubenswrapper[4749]: E1001 13:23:47.044691 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:23:47 crc kubenswrapper[4749]: E1001 13:23:47.044755 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift podName:6ef9f56d-2299-424f-9cc3-21cd7fcae8c1 nodeName:}" failed. No retries permitted until 2025-10-01 13:23:48.044736026 +0000 UTC m=+1088.098720925 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift") pod "swift-storage-0" (UID: "6ef9f56d-2299-424f-9cc3-21cd7fcae8c1") : configmap "swift-ring-files" not found Oct 01 13:23:47 crc kubenswrapper[4749]: I1001 13:23:47.680366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666cf554b5-49snv" event={"ID":"cb8d09d5-0c47-4f41-b06f-8915f1dd0676","Type":"ContainerStarted","Data":"25a6a0400009b888b328975d5bfb4b2469500bd325c7b8a664c2487cf5afe1da"} Oct 01 13:23:47 crc kubenswrapper[4749]: I1001 13:23:47.680656 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:23:47 crc kubenswrapper[4749]: I1001 13:23:47.683745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"78021fea-966d-45d2-8816-265437360e8f","Type":"ContainerStarted","Data":"6c5ab8e7c48b8908cb180e8fd985d03f0fbca686f8c68aa42ceee84611fb3043"} Oct 01 13:23:47 crc kubenswrapper[4749]: I1001 13:23:47.683771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"78021fea-966d-45d2-8816-265437360e8f","Type":"ContainerStarted","Data":"571bc94844e4e1c6b0cce1ef2aba3b7dd813d8f65aff537402789351b1d01b90"} Oct 01 13:23:47 crc kubenswrapper[4749]: I1001 13:23:47.684191 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 01 13:23:47 crc kubenswrapper[4749]: I1001 13:23:47.708870 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666cf554b5-49snv" podStartSLOduration=2.708851445 podStartE2EDuration="2.708851445s" podCreationTimestamp="2025-10-01 13:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:23:47.703970562 +0000 UTC m=+1087.757955471" 
watchObservedRunningTime="2025-10-01 13:23:47.708851445 +0000 UTC m=+1087.762836334" Oct 01 13:23:47 crc kubenswrapper[4749]: I1001 13:23:47.728710 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.843585049 podStartE2EDuration="3.728694538s" podCreationTimestamp="2025-10-01 13:23:44 +0000 UTC" firstStartedPulling="2025-10-01 13:23:45.77002698 +0000 UTC m=+1085.824011879" lastFinishedPulling="2025-10-01 13:23:46.655136469 +0000 UTC m=+1086.709121368" observedRunningTime="2025-10-01 13:23:47.723760333 +0000 UTC m=+1087.777745232" watchObservedRunningTime="2025-10-01 13:23:47.728694538 +0000 UTC m=+1087.782679427" Oct 01 13:23:48 crc kubenswrapper[4749]: I1001 13:23:48.060582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:48 crc kubenswrapper[4749]: E1001 13:23:48.060806 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:23:48 crc kubenswrapper[4749]: E1001 13:23:48.060832 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:23:48 crc kubenswrapper[4749]: E1001 13:23:48.060894 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift podName:6ef9f56d-2299-424f-9cc3-21cd7fcae8c1 nodeName:}" failed. No retries permitted until 2025-10-01 13:23:50.060874986 +0000 UTC m=+1090.114859895 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift") pod "swift-storage-0" (UID: "6ef9f56d-2299-424f-9cc3-21cd7fcae8c1") : configmap "swift-ring-files" not found Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.095067 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:50 crc kubenswrapper[4749]: E1001 13:23:50.095604 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:23:50 crc kubenswrapper[4749]: E1001 13:23:50.095807 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:23:50 crc kubenswrapper[4749]: E1001 13:23:50.095887 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift podName:6ef9f56d-2299-424f-9cc3-21cd7fcae8c1 nodeName:}" failed. No retries permitted until 2025-10-01 13:23:54.095864328 +0000 UTC m=+1094.149849237 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift") pod "swift-storage-0" (UID: "6ef9f56d-2299-424f-9cc3-21cd7fcae8c1") : configmap "swift-ring-files" not found Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.252474 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-h6gm9"] Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.254098 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.256120 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.256535 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.258537 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.298757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-dispersionconf\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.298820 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-swiftconf\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.298853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9csn\" (UniqueName: \"kubernetes.io/projected/c0804583-6f4e-48e5-99f5-eaee2844191d-kube-api-access-g9csn\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.298940 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-ring-data-devices\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.299006 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-scripts\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.299036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c0804583-6f4e-48e5-99f5-eaee2844191d-etc-swift\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.299093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-combined-ca-bundle\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.307905 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-h6gm9"] Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.400946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-ring-data-devices\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: 
I1001 13:23:50.401104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-scripts\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.401155 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c0804583-6f4e-48e5-99f5-eaee2844191d-etc-swift\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.401438 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-combined-ca-bundle\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.401493 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-dispersionconf\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.401539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-swiftconf\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.401606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g9csn\" (UniqueName: \"kubernetes.io/projected/c0804583-6f4e-48e5-99f5-eaee2844191d-kube-api-access-g9csn\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.402683 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-ring-data-devices\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.403026 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c0804583-6f4e-48e5-99f5-eaee2844191d-etc-swift\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.403076 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-scripts\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.407834 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-combined-ca-bundle\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.408599 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-swiftconf\") pod 
\"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.409067 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-dispersionconf\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.421838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9csn\" (UniqueName: \"kubernetes.io/projected/c0804583-6f4e-48e5-99f5-eaee2844191d-kube-api-access-g9csn\") pod \"swift-ring-rebalance-h6gm9\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.575601 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:23:50 crc kubenswrapper[4749]: I1001 13:23:50.714876 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerStarted","Data":"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161"} Oct 01 13:23:51 crc kubenswrapper[4749]: I1001 13:23:51.119496 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-h6gm9"] Oct 01 13:23:51 crc kubenswrapper[4749]: E1001 13:23:51.519627 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.220:41792->38.102.83.220:34693: write tcp 38.102.83.220:41792->38.102.83.220:34693: write: broken pipe Oct 01 13:23:51 crc kubenswrapper[4749]: I1001 13:23:51.725504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-h6gm9" event={"ID":"c0804583-6f4e-48e5-99f5-eaee2844191d","Type":"ContainerStarted","Data":"c6fcb564cacaf52eaa82b07c9e8704eae066de33c71469c497ce534e847b45f4"} Oct 01 13:23:52 crc kubenswrapper[4749]: I1001 13:23:52.210201 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 01 13:23:52 crc kubenswrapper[4749]: I1001 13:23:52.210294 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 01 13:23:52 crc kubenswrapper[4749]: I1001 13:23:52.305685 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 01 13:23:52 crc kubenswrapper[4749]: I1001 13:23:52.735523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerStarted","Data":"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247"} Oct 01 13:23:52 crc kubenswrapper[4749]: I1001 
13:23:52.793464 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.087592 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-g8hvl"] Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.089117 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g8hvl" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.095252 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g8hvl"] Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.149371 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcsgv\" (UniqueName: \"kubernetes.io/projected/061adc28-9cad-4d25-8245-906926bc7509-kube-api-access-lcsgv\") pod \"keystone-db-create-g8hvl\" (UID: \"061adc28-9cad-4d25-8245-906926bc7509\") " pod="openstack/keystone-db-create-g8hvl" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.189116 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.189170 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.251510 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcsgv\" (UniqueName: \"kubernetes.io/projected/061adc28-9cad-4d25-8245-906926bc7509-kube-api-access-lcsgv\") pod \"keystone-db-create-g8hvl\" (UID: \"061adc28-9cad-4d25-8245-906926bc7509\") " pod="openstack/keystone-db-create-g8hvl" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.262766 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:53 crc 
kubenswrapper[4749]: I1001 13:23:53.279999 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcsgv\" (UniqueName: \"kubernetes.io/projected/061adc28-9cad-4d25-8245-906926bc7509-kube-api-access-lcsgv\") pod \"keystone-db-create-g8hvl\" (UID: \"061adc28-9cad-4d25-8245-906926bc7509\") " pod="openstack/keystone-db-create-g8hvl" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.311919 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-z2mtw"] Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.313075 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z2mtw" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.330723 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-z2mtw"] Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.352845 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjrs9\" (UniqueName: \"kubernetes.io/projected/45fc3ae7-532e-4a89-91e7-7bcd8ff37a19-kube-api-access-pjrs9\") pod \"placement-db-create-z2mtw\" (UID: \"45fc3ae7-532e-4a89-91e7-7bcd8ff37a19\") " pod="openstack/placement-db-create-z2mtw" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.415625 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-g8hvl" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.454058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjrs9\" (UniqueName: \"kubernetes.io/projected/45fc3ae7-532e-4a89-91e7-7bcd8ff37a19-kube-api-access-pjrs9\") pod \"placement-db-create-z2mtw\" (UID: \"45fc3ae7-532e-4a89-91e7-7bcd8ff37a19\") " pod="openstack/placement-db-create-z2mtw" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.473004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjrs9\" (UniqueName: \"kubernetes.io/projected/45fc3ae7-532e-4a89-91e7-7bcd8ff37a19-kube-api-access-pjrs9\") pod \"placement-db-create-z2mtw\" (UID: \"45fc3ae7-532e-4a89-91e7-7bcd8ff37a19\") " pod="openstack/placement-db-create-z2mtw" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.644771 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z2mtw" Oct 01 13:23:53 crc kubenswrapper[4749]: I1001 13:23:53.816086 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 01 13:23:54 crc kubenswrapper[4749]: I1001 13:23:54.189527 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:23:54 crc kubenswrapper[4749]: E1001 13:23:54.189786 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 13:23:54 crc kubenswrapper[4749]: E1001 13:23:54.189959 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 13:23:54 crc kubenswrapper[4749]: E1001 
13:23:54.190042 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift podName:6ef9f56d-2299-424f-9cc3-21cd7fcae8c1 nodeName:}" failed. No retries permitted until 2025-10-01 13:24:02.190015595 +0000 UTC m=+1102.244000534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift") pod "swift-storage-0" (UID: "6ef9f56d-2299-424f-9cc3-21cd7fcae8c1") : configmap "swift-ring-files" not found
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.243730 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-bc6w2"]
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.244826 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-bc6w2"
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.258593 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-bc6w2"]
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.314264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5nt2\" (UniqueName: \"kubernetes.io/projected/e6d72dd5-fa0f-48f9-9845-4dbccc6bd156-kube-api-access-b5nt2\") pod \"watcher-db-create-bc6w2\" (UID: \"e6d72dd5-fa0f-48f9-9845-4dbccc6bd156\") " pod="openstack/watcher-db-create-bc6w2"
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.416005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5nt2\" (UniqueName: \"kubernetes.io/projected/e6d72dd5-fa0f-48f9-9845-4dbccc6bd156-kube-api-access-b5nt2\") pod \"watcher-db-create-bc6w2\" (UID: \"e6d72dd5-fa0f-48f9-9845-4dbccc6bd156\") " pod="openstack/watcher-db-create-bc6w2"
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.433303 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5nt2\" (UniqueName: \"kubernetes.io/projected/e6d72dd5-fa0f-48f9-9845-4dbccc6bd156-kube-api-access-b5nt2\") pod \"watcher-db-create-bc6w2\" (UID: \"e6d72dd5-fa0f-48f9-9845-4dbccc6bd156\") " pod="openstack/watcher-db-create-bc6w2"
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.463434 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666cf554b5-49snv"
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.527924 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5446c9d685-t66rg"]
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.530944 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" podUID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerName="dnsmasq-dns" containerID="cri-o://f03b3ee0b148ce078a7b2628173db46df1b8829f714e79cbf5a2fcdc63e4c2f1" gracePeriod=10
Oct 01 13:23:55 crc kubenswrapper[4749]: I1001 13:23:55.566965 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-bc6w2"
Oct 01 13:23:56 crc kubenswrapper[4749]: I1001 13:23:56.962407 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" podUID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused"
Oct 01 13:23:57 crc kubenswrapper[4749]: I1001 13:23:57.778188 4749 generic.go:334] "Generic (PLEG): container finished" podID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerID="f03b3ee0b148ce078a7b2628173db46df1b8829f714e79cbf5a2fcdc63e4c2f1" exitCode=0
Oct 01 13:23:57 crc kubenswrapper[4749]: I1001 13:23:57.778258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" event={"ID":"6cd3a926-2d20-442c-8c9f-8b1641848284","Type":"ContainerDied","Data":"f03b3ee0b148ce078a7b2628173db46df1b8829f714e79cbf5a2fcdc63e4c2f1"}
Oct 01 13:23:58 crc kubenswrapper[4749]: I1001 13:23:58.622141 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-jrmhv"]
Oct 01 13:23:58 crc kubenswrapper[4749]: I1001 13:23:58.626051 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrmhv"
Oct 01 13:23:58 crc kubenswrapper[4749]: I1001 13:23:58.640151 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jrmhv"]
Oct 01 13:23:58 crc kubenswrapper[4749]: I1001 13:23:58.689491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq64p\" (UniqueName: \"kubernetes.io/projected/c596b48d-fdb6-4b00-96cb-a276d563900f-kube-api-access-rq64p\") pod \"glance-db-create-jrmhv\" (UID: \"c596b48d-fdb6-4b00-96cb-a276d563900f\") " pod="openstack/glance-db-create-jrmhv"
Oct 01 13:23:58 crc kubenswrapper[4749]: I1001 13:23:58.791329 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq64p\" (UniqueName: \"kubernetes.io/projected/c596b48d-fdb6-4b00-96cb-a276d563900f-kube-api-access-rq64p\") pod \"glance-db-create-jrmhv\" (UID: \"c596b48d-fdb6-4b00-96cb-a276d563900f\") " pod="openstack/glance-db-create-jrmhv"
Oct 01 13:23:58 crc kubenswrapper[4749]: I1001 13:23:58.814610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq64p\" (UniqueName: \"kubernetes.io/projected/c596b48d-fdb6-4b00-96cb-a276d563900f-kube-api-access-rq64p\") pod \"glance-db-create-jrmhv\" (UID: \"c596b48d-fdb6-4b00-96cb-a276d563900f\") " pod="openstack/glance-db-create-jrmhv"
Oct 01 13:23:58 crc kubenswrapper[4749]: I1001 13:23:58.972820 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrmhv"
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.597297 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5446c9d685-t66rg"
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.663188 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-bc6w2"]
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.707304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-dns-svc\") pod \"6cd3a926-2d20-442c-8c9f-8b1641848284\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") "
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.707355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlxkq\" (UniqueName: \"kubernetes.io/projected/6cd3a926-2d20-442c-8c9f-8b1641848284-kube-api-access-vlxkq\") pod \"6cd3a926-2d20-442c-8c9f-8b1641848284\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") "
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.707403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-config\") pod \"6cd3a926-2d20-442c-8c9f-8b1641848284\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") "
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.707418 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-sb\") pod \"6cd3a926-2d20-442c-8c9f-8b1641848284\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") "
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.707436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-nb\") pod \"6cd3a926-2d20-442c-8c9f-8b1641848284\" (UID: \"6cd3a926-2d20-442c-8c9f-8b1641848284\") "
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.713483 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd3a926-2d20-442c-8c9f-8b1641848284-kube-api-access-vlxkq" (OuterVolumeSpecName: "kube-api-access-vlxkq") pod "6cd3a926-2d20-442c-8c9f-8b1641848284" (UID: "6cd3a926-2d20-442c-8c9f-8b1641848284"). InnerVolumeSpecName "kube-api-access-vlxkq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.783949 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g8hvl"]
Oct 01 13:23:59 crc kubenswrapper[4749]: W1001 13:23:59.791387 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod061adc28_9cad_4d25_8245_906926bc7509.slice/crio-d31ed6c574fba31d4edf62b0cd77e3d33f82576f6ba4918dda8cc4afec930128 WatchSource:0}: Error finding container d31ed6c574fba31d4edf62b0cd77e3d33f82576f6ba4918dda8cc4afec930128: Status 404 returned error can't find the container with id d31ed6c574fba31d4edf62b0cd77e3d33f82576f6ba4918dda8cc4afec930128
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.792374 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-z2mtw"]
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.794252 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5446c9d685-t66rg" event={"ID":"6cd3a926-2d20-442c-8c9f-8b1641848284","Type":"ContainerDied","Data":"95942a3f75d2e4964e3d3e7aa42bd92ce1f6356059ec7d66b6756e9aaf1dd882"}
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.794291 4749 scope.go:117] "RemoveContainer" containerID="f03b3ee0b148ce078a7b2628173db46df1b8829f714e79cbf5a2fcdc63e4c2f1"
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.794479 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5446c9d685-t66rg"
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.796234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-bc6w2" event={"ID":"e6d72dd5-fa0f-48f9-9845-4dbccc6bd156","Type":"ContainerStarted","Data":"625d7f0ad3b0d3dd81e00990ab5c8171859f9da9cad79ab824d3f3cb48f1a480"}
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.799602 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-jrmhv"]
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.809753 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlxkq\" (UniqueName: \"kubernetes.io/projected/6cd3a926-2d20-442c-8c9f-8b1641848284-kube-api-access-vlxkq\") on node \"crc\" DevicePath \"\""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.811422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-config" (OuterVolumeSpecName: "config") pod "6cd3a926-2d20-442c-8c9f-8b1641848284" (UID: "6cd3a926-2d20-442c-8c9f-8b1641848284"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.814553 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6cd3a926-2d20-442c-8c9f-8b1641848284" (UID: "6cd3a926-2d20-442c-8c9f-8b1641848284"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.825372 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6cd3a926-2d20-442c-8c9f-8b1641848284" (UID: "6cd3a926-2d20-442c-8c9f-8b1641848284"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.827190 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6cd3a926-2d20-442c-8c9f-8b1641848284" (UID: "6cd3a926-2d20-442c-8c9f-8b1641848284"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.827327 4749 scope.go:117] "RemoveContainer" containerID="a4fbc9f6e3df906bf71449cb2a2989385add840041052eb23c1441ee25ad97dd"
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.911650 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-config\") on node \"crc\" DevicePath \"\""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.911677 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.911687 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 13:23:59 crc kubenswrapper[4749]: I1001 13:23:59.911695 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6cd3a926-2d20-442c-8c9f-8b1641848284-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.220809 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5446c9d685-t66rg"]
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.228129 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.237979 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5446c9d685-t66rg"]
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.804658 4749 generic.go:334] "Generic (PLEG): container finished" podID="e6d72dd5-fa0f-48f9-9845-4dbccc6bd156" containerID="5568a633dbc33307256d1285560ced24873c6788d15fe1702c024bdaa886400e" exitCode=0
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.804706 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-bc6w2" event={"ID":"e6d72dd5-fa0f-48f9-9845-4dbccc6bd156","Type":"ContainerDied","Data":"5568a633dbc33307256d1285560ced24873c6788d15fe1702c024bdaa886400e"}
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.806533 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-h6gm9" event={"ID":"c0804583-6f4e-48e5-99f5-eaee2844191d","Type":"ContainerStarted","Data":"f395201c5533b9d8778d268946778fa63281e617cd2ec654b9bb70fe7e7caaac"}
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.809002 4749 generic.go:334] "Generic (PLEG): container finished" podID="c596b48d-fdb6-4b00-96cb-a276d563900f" containerID="b13f73df7466a8e5676c158f45a06cb7dac5a47ecd4a682620d91d56ebe4f727" exitCode=0
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.809059 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrmhv" event={"ID":"c596b48d-fdb6-4b00-96cb-a276d563900f","Type":"ContainerDied","Data":"b13f73df7466a8e5676c158f45a06cb7dac5a47ecd4a682620d91d56ebe4f727"}
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.809085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrmhv" event={"ID":"c596b48d-fdb6-4b00-96cb-a276d563900f","Type":"ContainerStarted","Data":"3f0b92751df4269316b17f80e1f007890a3f7fd335030474e93f7c09e232be4c"}
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.810863 4749 generic.go:334] "Generic (PLEG): container finished" podID="061adc28-9cad-4d25-8245-906926bc7509" containerID="dfe15272485bedc38157dcb6fc75040d770b1171f9480c598a69acc81d118e80" exitCode=0
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.810923 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g8hvl" event={"ID":"061adc28-9cad-4d25-8245-906926bc7509","Type":"ContainerDied","Data":"dfe15272485bedc38157dcb6fc75040d770b1171f9480c598a69acc81d118e80"}
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.810951 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g8hvl" event={"ID":"061adc28-9cad-4d25-8245-906926bc7509","Type":"ContainerStarted","Data":"d31ed6c574fba31d4edf62b0cd77e3d33f82576f6ba4918dda8cc4afec930128"}
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.812445 4749 generic.go:334] "Generic (PLEG): container finished" podID="45fc3ae7-532e-4a89-91e7-7bcd8ff37a19" containerID="3a80bdbcf49494f38dabca41e55665eb196d8b62b317fce61237d90d3c4ab196" exitCode=0
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.812488 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z2mtw" event={"ID":"45fc3ae7-532e-4a89-91e7-7bcd8ff37a19","Type":"ContainerDied","Data":"3a80bdbcf49494f38dabca41e55665eb196d8b62b317fce61237d90d3c4ab196"}
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.812519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z2mtw" event={"ID":"45fc3ae7-532e-4a89-91e7-7bcd8ff37a19","Type":"ContainerStarted","Data":"ba00e8c9df3dac1763de6986604fa989ec041185e562adb7c4d25788524b7718"}
Oct 01 13:24:00 crc kubenswrapper[4749]: I1001 13:24:00.886723 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-h6gm9" podStartSLOduration=2.419312875 podStartE2EDuration="10.886706274s" podCreationTimestamp="2025-10-01 13:23:50 +0000 UTC" firstStartedPulling="2025-10-01 13:23:51.1226838 +0000 UTC m=+1091.176668709" lastFinishedPulling="2025-10-01 13:23:59.590077209 +0000 UTC m=+1099.644062108" observedRunningTime="2025-10-01 13:24:00.884590707 +0000 UTC m=+1100.938575606" watchObservedRunningTime="2025-10-01 13:24:00.886706274 +0000 UTC m=+1100.940691173"
Oct 01 13:24:01 crc kubenswrapper[4749]: I1001 13:24:01.247476 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd3a926-2d20-442c-8c9f-8b1641848284" path="/var/lib/kubelet/pods/6cd3a926-2d20-442c-8c9f-8b1641848284/volumes"
Oct 01 13:24:02 crc kubenswrapper[4749]: I1001 13:24:02.106586 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:24:02 crc kubenswrapper[4749]: I1001 13:24:02.106631 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:24:02 crc kubenswrapper[4749]: I1001 13:24:02.255766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0"
Oct 01 13:24:02 crc kubenswrapper[4749]: E1001 13:24:02.255976 4749 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 01 13:24:02 crc kubenswrapper[4749]: E1001 13:24:02.256241 4749 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 01 13:24:02 crc kubenswrapper[4749]: E1001 13:24:02.256312 4749 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift podName:6ef9f56d-2299-424f-9cc3-21cd7fcae8c1 nodeName:}" failed. No retries permitted until 2025-10-01 13:24:18.256293463 +0000 UTC m=+1118.310278362 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift") pod "swift-storage-0" (UID: "6ef9f56d-2299-424f-9cc3-21cd7fcae8c1") : configmap "swift-ring-files" not found
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.027633 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-bc6w2"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.046081 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g8hvl"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.068315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcsgv\" (UniqueName: \"kubernetes.io/projected/061adc28-9cad-4d25-8245-906926bc7509-kube-api-access-lcsgv\") pod \"061adc28-9cad-4d25-8245-906926bc7509\" (UID: \"061adc28-9cad-4d25-8245-906926bc7509\") "
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.068407 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5nt2\" (UniqueName: \"kubernetes.io/projected/e6d72dd5-fa0f-48f9-9845-4dbccc6bd156-kube-api-access-b5nt2\") pod \"e6d72dd5-fa0f-48f9-9845-4dbccc6bd156\" (UID: \"e6d72dd5-fa0f-48f9-9845-4dbccc6bd156\") "
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.092711 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d72dd5-fa0f-48f9-9845-4dbccc6bd156-kube-api-access-b5nt2" (OuterVolumeSpecName: "kube-api-access-b5nt2") pod "e6d72dd5-fa0f-48f9-9845-4dbccc6bd156" (UID: "e6d72dd5-fa0f-48f9-9845-4dbccc6bd156"). InnerVolumeSpecName "kube-api-access-b5nt2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.099518 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrmhv"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.104838 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061adc28-9cad-4d25-8245-906926bc7509-kube-api-access-lcsgv" (OuterVolumeSpecName: "kube-api-access-lcsgv") pod "061adc28-9cad-4d25-8245-906926bc7509" (UID: "061adc28-9cad-4d25-8245-906926bc7509"). InnerVolumeSpecName "kube-api-access-lcsgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.142064 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z2mtw"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.170165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq64p\" (UniqueName: \"kubernetes.io/projected/c596b48d-fdb6-4b00-96cb-a276d563900f-kube-api-access-rq64p\") pod \"c596b48d-fdb6-4b00-96cb-a276d563900f\" (UID: \"c596b48d-fdb6-4b00-96cb-a276d563900f\") "
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.170403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjrs9\" (UniqueName: \"kubernetes.io/projected/45fc3ae7-532e-4a89-91e7-7bcd8ff37a19-kube-api-access-pjrs9\") pod \"45fc3ae7-532e-4a89-91e7-7bcd8ff37a19\" (UID: \"45fc3ae7-532e-4a89-91e7-7bcd8ff37a19\") "
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.170848 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcsgv\" (UniqueName: \"kubernetes.io/projected/061adc28-9cad-4d25-8245-906926bc7509-kube-api-access-lcsgv\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.170871 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5nt2\" (UniqueName: \"kubernetes.io/projected/e6d72dd5-fa0f-48f9-9845-4dbccc6bd156-kube-api-access-b5nt2\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.173376 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c596b48d-fdb6-4b00-96cb-a276d563900f-kube-api-access-rq64p" (OuterVolumeSpecName: "kube-api-access-rq64p") pod "c596b48d-fdb6-4b00-96cb-a276d563900f" (UID: "c596b48d-fdb6-4b00-96cb-a276d563900f"). InnerVolumeSpecName "kube-api-access-rq64p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.175586 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fc3ae7-532e-4a89-91e7-7bcd8ff37a19-kube-api-access-pjrs9" (OuterVolumeSpecName: "kube-api-access-pjrs9") pod "45fc3ae7-532e-4a89-91e7-7bcd8ff37a19" (UID: "45fc3ae7-532e-4a89-91e7-7bcd8ff37a19"). InnerVolumeSpecName "kube-api-access-pjrs9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.283595 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq64p\" (UniqueName: \"kubernetes.io/projected/c596b48d-fdb6-4b00-96cb-a276d563900f-kube-api-access-rq64p\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.283635 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjrs9\" (UniqueName: \"kubernetes.io/projected/45fc3ae7-532e-4a89-91e7-7bcd8ff37a19-kube-api-access-pjrs9\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.836093 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g8hvl" event={"ID":"061adc28-9cad-4d25-8245-906926bc7509","Type":"ContainerDied","Data":"d31ed6c574fba31d4edf62b0cd77e3d33f82576f6ba4918dda8cc4afec930128"}
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.836431 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d31ed6c574fba31d4edf62b0cd77e3d33f82576f6ba4918dda8cc4afec930128"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.836350 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g8hvl"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.840790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-z2mtw" event={"ID":"45fc3ae7-532e-4a89-91e7-7bcd8ff37a19","Type":"ContainerDied","Data":"ba00e8c9df3dac1763de6986604fa989ec041185e562adb7c4d25788524b7718"}
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.840830 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba00e8c9df3dac1763de6986604fa989ec041185e562adb7c4d25788524b7718"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.840879 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-z2mtw"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.844316 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-bc6w2" event={"ID":"e6d72dd5-fa0f-48f9-9845-4dbccc6bd156","Type":"ContainerDied","Data":"625d7f0ad3b0d3dd81e00990ab5c8171859f9da9cad79ab824d3f3cb48f1a480"}
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.844353 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="625d7f0ad3b0d3dd81e00990ab5c8171859f9da9cad79ab824d3f3cb48f1a480"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.844335 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-bc6w2"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.856624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-jrmhv" event={"ID":"c596b48d-fdb6-4b00-96cb-a276d563900f","Type":"ContainerDied","Data":"3f0b92751df4269316b17f80e1f007890a3f7fd335030474e93f7c09e232be4c"}
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.856668 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f0b92751df4269316b17f80e1f007890a3f7fd335030474e93f7c09e232be4c"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.856743 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-jrmhv"
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.872281 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerStarted","Data":"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a"}
Oct 01 13:24:03 crc kubenswrapper[4749]: I1001 13:24:03.895933 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=7.690756691 podStartE2EDuration="48.895918901s" podCreationTimestamp="2025-10-01 13:23:15 +0000 UTC" firstStartedPulling="2025-10-01 13:23:22.410586014 +0000 UTC m=+1062.464570913" lastFinishedPulling="2025-10-01 13:24:03.615748224 +0000 UTC m=+1103.669733123" observedRunningTime="2025-10-01 13:24:03.895265473 +0000 UTC m=+1103.949250392" watchObservedRunningTime="2025-10-01 13:24:03.895918901 +0000 UTC m=+1103.949903790"
Oct 01 13:24:04 crc kubenswrapper[4749]: I1001 13:24:04.868074 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sl4xv" podUID="4035d0d3-eeec-429f-b31e-ab4649ecf92a" containerName="ovn-controller" probeResult="failure" output=<
Oct 01 13:24:04 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 01 13:24:04 crc kubenswrapper[4749]: >
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.422751 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-46df-account-create-lm8pz"]
Oct 01 13:24:05 crc kubenswrapper[4749]: E1001 13:24:05.423101 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061adc28-9cad-4d25-8245-906926bc7509" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423119 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="061adc28-9cad-4d25-8245-906926bc7509" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: E1001 13:24:05.423138 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerName="init"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423147 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerName="init"
Oct 01 13:24:05 crc kubenswrapper[4749]: E1001 13:24:05.423162 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fc3ae7-532e-4a89-91e7-7bcd8ff37a19" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423171 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fc3ae7-532e-4a89-91e7-7bcd8ff37a19" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: E1001 13:24:05.423189 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c596b48d-fdb6-4b00-96cb-a276d563900f" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423196 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c596b48d-fdb6-4b00-96cb-a276d563900f" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: E1001 13:24:05.423212 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d72dd5-fa0f-48f9-9845-4dbccc6bd156" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423220 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d72dd5-fa0f-48f9-9845-4dbccc6bd156" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: E1001 13:24:05.423238 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerName="dnsmasq-dns"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423263 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerName="dnsmasq-dns"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423512 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd3a926-2d20-442c-8c9f-8b1641848284" containerName="dnsmasq-dns"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423532 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="061adc28-9cad-4d25-8245-906926bc7509" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423548 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fc3ae7-532e-4a89-91e7-7bcd8ff37a19" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423562 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c596b48d-fdb6-4b00-96cb-a276d563900f" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.423572 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d72dd5-fa0f-48f9-9845-4dbccc6bd156" containerName="mariadb-database-create"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.424184 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-46df-account-create-lm8pz"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.427405 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.435839 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-46df-account-create-lm8pz"]
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.535570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w8wv\" (UniqueName: \"kubernetes.io/projected/a108a313-0fd8-41bc-9fbf-a86a003fe066-kube-api-access-8w8wv\") pod \"watcher-46df-account-create-lm8pz\" (UID: \"a108a313-0fd8-41bc-9fbf-a86a003fe066\") " pod="openstack/watcher-46df-account-create-lm8pz"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.637387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w8wv\" (UniqueName: \"kubernetes.io/projected/a108a313-0fd8-41bc-9fbf-a86a003fe066-kube-api-access-8w8wv\") pod \"watcher-46df-account-create-lm8pz\" (UID: \"a108a313-0fd8-41bc-9fbf-a86a003fe066\") " pod="openstack/watcher-46df-account-create-lm8pz"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.662960 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w8wv\" (UniqueName: \"kubernetes.io/projected/a108a313-0fd8-41bc-9fbf-a86a003fe066-kube-api-access-8w8wv\") pod \"watcher-46df-account-create-lm8pz\" (UID: \"a108a313-0fd8-41bc-9fbf-a86a003fe066\") " pod="openstack/watcher-46df-account-create-lm8pz"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.746862 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-46df-account-create-lm8pz"
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.894743 4749 generic.go:334] "Generic (PLEG): container finished" podID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerID="6432798b0d8b723cf7c554daa2ac158cb3aa44a8340879d92c784631824689c2" exitCode=0
Oct 01 13:24:05 crc kubenswrapper[4749]: I1001 13:24:05.895038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89621c5c-1d46-44be-852f-1a37dccf02e9","Type":"ContainerDied","Data":"6432798b0d8b723cf7c554daa2ac158cb3aa44a8340879d92c784631824689c2"}
Oct 01 13:24:06 crc kubenswrapper[4749]: I1001 13:24:06.220526 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-46df-account-create-lm8pz"]
Oct 01 13:24:06 crc kubenswrapper[4749]: I1001 13:24:06.513094 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Oct 01 13:24:06 crc kubenswrapper[4749]: I1001 13:24:06.921541 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89621c5c-1d46-44be-852f-1a37dccf02e9","Type":"ContainerStarted","Data":"c42ac4cb417747efda2345fa343f1c6c20a214153c3b3843b66e4e747ae8e5f7"}
Oct 01 13:24:06 crc kubenswrapper[4749]: I1001 13:24:06.922292 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 01 13:24:06 crc kubenswrapper[4749]: I1001 13:24:06.928356 4749 generic.go:334] "Generic (PLEG): container finished" podID="a108a313-0fd8-41bc-9fbf-a86a003fe066" containerID="2f617727b338da067d00d361aabcf7196a049f8e26cbda97fea680dd621faca7" exitCode=0
Oct 01 13:24:06 crc kubenswrapper[4749]: I1001 13:24:06.928411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-46df-account-create-lm8pz" event={"ID":"a108a313-0fd8-41bc-9fbf-a86a003fe066","Type":"ContainerDied","Data":"2f617727b338da067d00d361aabcf7196a049f8e26cbda97fea680dd621faca7"}
Oct 01 13:24:06 crc kubenswrapper[4749]: I1001 13:24:06.928447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-46df-account-create-lm8pz" event={"ID":"a108a313-0fd8-41bc-9fbf-a86a003fe066","Type":"ContainerStarted","Data":"1378b80c5f07b35009bebccc3b1d03ac48431584b67a690453a3390be024b37a"}
Oct 01 13:24:06 crc kubenswrapper[4749]: I1001 13:24:06.953434 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.061206916 podStartE2EDuration="58.953415977s" podCreationTimestamp="2025-10-01 13:23:08 +0000 UTC" firstStartedPulling="2025-10-01 13:23:21.840524935 +0000 UTC m=+1061.894509834" lastFinishedPulling="2025-10-01 13:23:29.732733986 +0000 UTC m=+1069.786718895" observedRunningTime="2025-10-01 13:24:06.949044278 +0000 UTC m=+1107.003029177" watchObservedRunningTime="2025-10-01 13:24:06.953415977 +0000 UTC m=+1107.007400886"
Oct 01 13:24:07 crc kubenswrapper[4749]: I1001 13:24:07.937512 4749 generic.go:334] "Generic (PLEG): container finished" podID="35e1759f-e27a-4891-9fc0-37753b25689d" containerID="be178b2e3c636505e7559b1f4f4b2b933f257fb8e1218e28feced6d711498217" exitCode=0
Oct 01 13:24:07 crc kubenswrapper[4749]: I1001 13:24:07.937594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35e1759f-e27a-4891-9fc0-37753b25689d","Type":"ContainerDied","Data":"be178b2e3c636505e7559b1f4f4b2b933f257fb8e1218e28feced6d711498217"}
Oct 01 13:24:07 crc kubenswrapper[4749]: I1001 13:24:07.943485 4749 generic.go:334] "Generic (PLEG): container finished" podID="c0804583-6f4e-48e5-99f5-eaee2844191d" containerID="f395201c5533b9d8778d268946778fa63281e617cd2ec654b9bb70fe7e7caaac" exitCode=0
Oct 01 13:24:07 crc kubenswrapper[4749]: I1001 13:24:07.943533 4749 kubelet.go:2453] "SyncLoop
(PLEG): event for pod" pod="openstack/swift-ring-rebalance-h6gm9" event={"ID":"c0804583-6f4e-48e5-99f5-eaee2844191d","Type":"ContainerDied","Data":"f395201c5533b9d8778d268946778fa63281e617cd2ec654b9bb70fe7e7caaac"} Oct 01 13:24:07 crc kubenswrapper[4749]: I1001 13:24:07.945852 4749 generic.go:334] "Generic (PLEG): container finished" podID="1d655140-d63d-4e40-8de1-875213f37d4a" containerID="3da935e241d19c118277a2ebfd988ad64a4c330d0fc5f8743768b13613da5ec6" exitCode=0 Oct 01 13:24:07 crc kubenswrapper[4749]: I1001 13:24:07.946022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1d655140-d63d-4e40-8de1-875213f37d4a","Type":"ContainerDied","Data":"3da935e241d19c118277a2ebfd988ad64a4c330d0fc5f8743768b13613da5ec6"} Oct 01 13:24:08 crc kubenswrapper[4749]: E1001 13:24:08.001094 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0804583_6f4e_48e5_99f5_eaee2844191d.slice/crio-f395201c5533b9d8778d268946778fa63281e617cd2ec654b9bb70fe7e7caaac.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.278170 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-46df-account-create-lm8pz" Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.395279 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w8wv\" (UniqueName: \"kubernetes.io/projected/a108a313-0fd8-41bc-9fbf-a86a003fe066-kube-api-access-8w8wv\") pod \"a108a313-0fd8-41bc-9fbf-a86a003fe066\" (UID: \"a108a313-0fd8-41bc-9fbf-a86a003fe066\") " Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.401595 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a108a313-0fd8-41bc-9fbf-a86a003fe066-kube-api-access-8w8wv" (OuterVolumeSpecName: "kube-api-access-8w8wv") pod "a108a313-0fd8-41bc-9fbf-a86a003fe066" (UID: "a108a313-0fd8-41bc-9fbf-a86a003fe066"). InnerVolumeSpecName "kube-api-access-8w8wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.497441 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8w8wv\" (UniqueName: \"kubernetes.io/projected/a108a313-0fd8-41bc-9fbf-a86a003fe066-kube-api-access-8w8wv\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.956441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35e1759f-e27a-4891-9fc0-37753b25689d","Type":"ContainerStarted","Data":"8499483ffc6abaf600fe62853a84bef67de5586a9a9977d0c6e190fce38d33da"} Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.956665 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.958933 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-46df-account-create-lm8pz" event={"ID":"a108a313-0fd8-41bc-9fbf-a86a003fe066","Type":"ContainerDied","Data":"1378b80c5f07b35009bebccc3b1d03ac48431584b67a690453a3390be024b37a"} Oct 01 
13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.958967 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1378b80c5f07b35009bebccc3b1d03ac48431584b67a690453a3390be024b37a" Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.959008 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-46df-account-create-lm8pz" Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.967645 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"1d655140-d63d-4e40-8de1-875213f37d4a","Type":"ContainerStarted","Data":"d3c92f39dcb23d45fb4f55637a61cbb417801d04e8534bef55e21633d0aa5d8b"} Oct 01 13:24:08 crc kubenswrapper[4749]: I1001 13:24:08.968365 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.029602 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.397780684 podStartE2EDuration="1m2.029581646s" podCreationTimestamp="2025-10-01 13:23:07 +0000 UTC" firstStartedPulling="2025-10-01 13:23:21.874327679 +0000 UTC m=+1061.928312578" lastFinishedPulling="2025-10-01 13:23:30.506128641 +0000 UTC m=+1070.560113540" observedRunningTime="2025-10-01 13:24:09.000658476 +0000 UTC m=+1109.054643385" watchObservedRunningTime="2025-10-01 13:24:09.029581646 +0000 UTC m=+1109.083566545" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.032843 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=52.35886065 podStartE2EDuration="1m1.032827695s" podCreationTimestamp="2025-10-01 13:23:08 +0000 UTC" firstStartedPulling="2025-10-01 13:23:21.848693898 +0000 UTC m=+1061.902678797" lastFinishedPulling="2025-10-01 13:23:30.522660943 +0000 UTC m=+1070.576645842" 
observedRunningTime="2025-10-01 13:24:09.026684717 +0000 UTC m=+1109.080669626" watchObservedRunningTime="2025-10-01 13:24:09.032827695 +0000 UTC m=+1109.086812594" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.318949 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.410770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c0804583-6f4e-48e5-99f5-eaee2844191d-etc-swift\") pod \"c0804583-6f4e-48e5-99f5-eaee2844191d\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.411013 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-ring-data-devices\") pod \"c0804583-6f4e-48e5-99f5-eaee2844191d\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.411185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-dispersionconf\") pod \"c0804583-6f4e-48e5-99f5-eaee2844191d\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.411247 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-combined-ca-bundle\") pod \"c0804583-6f4e-48e5-99f5-eaee2844191d\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.411294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-scripts\") pod \"c0804583-6f4e-48e5-99f5-eaee2844191d\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.411318 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-swiftconf\") pod \"c0804583-6f4e-48e5-99f5-eaee2844191d\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.411386 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9csn\" (UniqueName: \"kubernetes.io/projected/c0804583-6f4e-48e5-99f5-eaee2844191d-kube-api-access-g9csn\") pod \"c0804583-6f4e-48e5-99f5-eaee2844191d\" (UID: \"c0804583-6f4e-48e5-99f5-eaee2844191d\") " Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.411566 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c0804583-6f4e-48e5-99f5-eaee2844191d" (UID: "c0804583-6f4e-48e5-99f5-eaee2844191d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.411876 4749 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.412941 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0804583-6f4e-48e5-99f5-eaee2844191d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c0804583-6f4e-48e5-99f5-eaee2844191d" (UID: "c0804583-6f4e-48e5-99f5-eaee2844191d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.427816 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0804583-6f4e-48e5-99f5-eaee2844191d-kube-api-access-g9csn" (OuterVolumeSpecName: "kube-api-access-g9csn") pod "c0804583-6f4e-48e5-99f5-eaee2844191d" (UID: "c0804583-6f4e-48e5-99f5-eaee2844191d"). InnerVolumeSpecName "kube-api-access-g9csn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.432397 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c0804583-6f4e-48e5-99f5-eaee2844191d" (UID: "c0804583-6f4e-48e5-99f5-eaee2844191d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.447069 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0804583-6f4e-48e5-99f5-eaee2844191d" (UID: "c0804583-6f4e-48e5-99f5-eaee2844191d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.450940 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-scripts" (OuterVolumeSpecName: "scripts") pod "c0804583-6f4e-48e5-99f5-eaee2844191d" (UID: "c0804583-6f4e-48e5-99f5-eaee2844191d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.456123 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c0804583-6f4e-48e5-99f5-eaee2844191d" (UID: "c0804583-6f4e-48e5-99f5-eaee2844191d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.513286 4749 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.513335 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.513360 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0804583-6f4e-48e5-99f5-eaee2844191d-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.513378 4749 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c0804583-6f4e-48e5-99f5-eaee2844191d-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.513396 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9csn\" (UniqueName: \"kubernetes.io/projected/c0804583-6f4e-48e5-99f5-eaee2844191d-kube-api-access-g9csn\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.513416 4749 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/c0804583-6f4e-48e5-99f5-eaee2844191d-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.854319 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sl4xv" podUID="4035d0d3-eeec-429f-b31e-ab4649ecf92a" containerName="ovn-controller" probeResult="failure" output=< Oct 01 13:24:09 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 13:24:09 crc kubenswrapper[4749]: > Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.978678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-h6gm9" event={"ID":"c0804583-6f4e-48e5-99f5-eaee2844191d","Type":"ContainerDied","Data":"c6fcb564cacaf52eaa82b07c9e8704eae066de33c71469c497ce534e847b45f4"} Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.978757 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6fcb564cacaf52eaa82b07c9e8704eae066de33c71469c497ce534e847b45f4" Oct 01 13:24:09 crc kubenswrapper[4749]: I1001 13:24:09.979123 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-h6gm9" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.132508 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b604-account-create-8v4pd"] Oct 01 13:24:13 crc kubenswrapper[4749]: E1001 13:24:13.133314 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a108a313-0fd8-41bc-9fbf-a86a003fe066" containerName="mariadb-account-create" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.133325 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a108a313-0fd8-41bc-9fbf-a86a003fe066" containerName="mariadb-account-create" Oct 01 13:24:13 crc kubenswrapper[4749]: E1001 13:24:13.133353 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0804583-6f4e-48e5-99f5-eaee2844191d" containerName="swift-ring-rebalance" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.133359 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0804583-6f4e-48e5-99f5-eaee2844191d" containerName="swift-ring-rebalance" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.133508 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0804583-6f4e-48e5-99f5-eaee2844191d" containerName="swift-ring-rebalance" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.133523 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a108a313-0fd8-41bc-9fbf-a86a003fe066" containerName="mariadb-account-create" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.134035 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b604-account-create-8v4pd" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.139083 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.154464 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b604-account-create-8v4pd"] Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.277144 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9ld\" (UniqueName: \"kubernetes.io/projected/f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71-kube-api-access-9h9ld\") pod \"keystone-b604-account-create-8v4pd\" (UID: \"f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71\") " pod="openstack/keystone-b604-account-create-8v4pd" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.378605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9ld\" (UniqueName: \"kubernetes.io/projected/f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71-kube-api-access-9h9ld\") pod \"keystone-b604-account-create-8v4pd\" (UID: \"f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71\") " pod="openstack/keystone-b604-account-create-8v4pd" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.397857 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9ld\" (UniqueName: \"kubernetes.io/projected/f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71-kube-api-access-9h9ld\") pod \"keystone-b604-account-create-8v4pd\" (UID: \"f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71\") " pod="openstack/keystone-b604-account-create-8v4pd" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.448362 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-687d-account-create-9ssdm"] Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.455032 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b604-account-create-8v4pd" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.471983 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-687d-account-create-9ssdm" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.476298 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.491641 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687d-account-create-9ssdm"] Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.621433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pxqx\" (UniqueName: \"kubernetes.io/projected/85e090bf-b7d2-4f65-b8c0-147f61612834-kube-api-access-2pxqx\") pod \"placement-687d-account-create-9ssdm\" (UID: \"85e090bf-b7d2-4f65-b8c0-147f61612834\") " pod="openstack/placement-687d-account-create-9ssdm" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.727093 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pxqx\" (UniqueName: \"kubernetes.io/projected/85e090bf-b7d2-4f65-b8c0-147f61612834-kube-api-access-2pxqx\") pod \"placement-687d-account-create-9ssdm\" (UID: \"85e090bf-b7d2-4f65-b8c0-147f61612834\") " pod="openstack/placement-687d-account-create-9ssdm" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.747561 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pxqx\" (UniqueName: \"kubernetes.io/projected/85e090bf-b7d2-4f65-b8c0-147f61612834-kube-api-access-2pxqx\") pod \"placement-687d-account-create-9ssdm\" (UID: \"85e090bf-b7d2-4f65-b8c0-147f61612834\") " pod="openstack/placement-687d-account-create-9ssdm" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.792600 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687d-account-create-9ssdm" Oct 01 13:24:13 crc kubenswrapper[4749]: I1001 13:24:13.959926 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b604-account-create-8v4pd"] Oct 01 13:24:13 crc kubenswrapper[4749]: W1001 13:24:13.970477 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ea6d66_83a3_4ab2_8e8c_8dbf8b861f71.slice/crio-4957d5c68be51059b2154b13c41c6b06cc7c9cc89bb57df029be534100a0d6bc WatchSource:0}: Error finding container 4957d5c68be51059b2154b13c41c6b06cc7c9cc89bb57df029be534100a0d6bc: Status 404 returned error can't find the container with id 4957d5c68be51059b2154b13c41c6b06cc7c9cc89bb57df029be534100a0d6bc Oct 01 13:24:14 crc kubenswrapper[4749]: I1001 13:24:14.016428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b604-account-create-8v4pd" event={"ID":"f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71","Type":"ContainerStarted","Data":"4957d5c68be51059b2154b13c41c6b06cc7c9cc89bb57df029be534100a0d6bc"} Oct 01 13:24:14 crc kubenswrapper[4749]: I1001 13:24:14.230070 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-687d-account-create-9ssdm"] Oct 01 13:24:14 crc kubenswrapper[4749]: W1001 13:24:14.261482 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e090bf_b7d2_4f65_b8c0_147f61612834.slice/crio-2fe86bf8c29e60d1754b3cf887b0249181a50ae3938f86d9d8106c6bdfe28efc WatchSource:0}: Error finding container 2fe86bf8c29e60d1754b3cf887b0249181a50ae3938f86d9d8106c6bdfe28efc: Status 404 returned error can't find the container with id 2fe86bf8c29e60d1754b3cf887b0249181a50ae3938f86d9d8106c6bdfe28efc Oct 01 13:24:14 crc kubenswrapper[4749]: I1001 13:24:14.856378 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-sl4xv" 
podUID="4035d0d3-eeec-429f-b31e-ab4649ecf92a" containerName="ovn-controller" probeResult="failure" output=< Oct 01 13:24:14 crc kubenswrapper[4749]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 13:24:14 crc kubenswrapper[4749]: > Oct 01 13:24:14 crc kubenswrapper[4749]: I1001 13:24:14.979004 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:24:14 crc kubenswrapper[4749]: I1001 13:24:14.991289 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-75t94" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.027259 4749 generic.go:334] "Generic (PLEG): container finished" podID="f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71" containerID="15143a0978f9ddd502a12bb6b3e166c92a0b005fcfa606929bfc5f6f9f3753a5" exitCode=0 Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.027553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b604-account-create-8v4pd" event={"ID":"f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71","Type":"ContainerDied","Data":"15143a0978f9ddd502a12bb6b3e166c92a0b005fcfa606929bfc5f6f9f3753a5"} Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.033493 4749 generic.go:334] "Generic (PLEG): container finished" podID="85e090bf-b7d2-4f65-b8c0-147f61612834" containerID="dc75dd6f34881a4b422fb71e94f85f36204730c0de5864799d407d5ef016948a" exitCode=0 Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.034370 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687d-account-create-9ssdm" event={"ID":"85e090bf-b7d2-4f65-b8c0-147f61612834","Type":"ContainerDied","Data":"dc75dd6f34881a4b422fb71e94f85f36204730c0de5864799d407d5ef016948a"} Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.034404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687d-account-create-9ssdm" 
event={"ID":"85e090bf-b7d2-4f65-b8c0-147f61612834","Type":"ContainerStarted","Data":"2fe86bf8c29e60d1754b3cf887b0249181a50ae3938f86d9d8106c6bdfe28efc"} Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.224496 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sl4xv-config-kq5d2"] Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.225558 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.228065 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.246670 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sl4xv-config-kq5d2"] Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.251759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-additional-scripts\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.251905 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run-ovn\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.251933 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: 
\"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.251954 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49dw\" (UniqueName: \"kubernetes.io/projected/c84137f7-d1b7-422d-b628-5d681da78cb2-kube-api-access-j49dw\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.251981 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-scripts\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.252007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-log-ovn\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.353876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-log-ovn\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.353956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-additional-scripts\") 
pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.354074 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run-ovn\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.354109 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.354131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49dw\" (UniqueName: \"kubernetes.io/projected/c84137f7-d1b7-422d-b628-5d681da78cb2-kube-api-access-j49dw\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.354154 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-scripts\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.354271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-log-ovn\") pod 
\"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.354305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run-ovn\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.354352 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.354821 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-additional-scripts\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.356346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-scripts\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.374003 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49dw\" (UniqueName: \"kubernetes.io/projected/c84137f7-d1b7-422d-b628-5d681da78cb2-kube-api-access-j49dw\") pod \"ovn-controller-sl4xv-config-kq5d2\" (UID: 
\"c84137f7-d1b7-422d-b628-5d681da78cb2\") " pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:15 crc kubenswrapper[4749]: I1001 13:24:15.540102 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.002832 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sl4xv-config-kq5d2"] Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.051407 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sl4xv-config-kq5d2" event={"ID":"c84137f7-d1b7-422d-b628-5d681da78cb2","Type":"ContainerStarted","Data":"83f9b2697e229ffd98e91f83bbdca1df9e0413fdc6cdbd3523b4a15e5cc25537"} Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.361209 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-687d-account-create-9ssdm" Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.368944 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pxqx\" (UniqueName: \"kubernetes.io/projected/85e090bf-b7d2-4f65-b8c0-147f61612834-kube-api-access-2pxqx\") pod \"85e090bf-b7d2-4f65-b8c0-147f61612834\" (UID: \"85e090bf-b7d2-4f65-b8c0-147f61612834\") " Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.373832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e090bf-b7d2-4f65-b8c0-147f61612834-kube-api-access-2pxqx" (OuterVolumeSpecName: "kube-api-access-2pxqx") pod "85e090bf-b7d2-4f65-b8c0-147f61612834" (UID: "85e090bf-b7d2-4f65-b8c0-147f61612834"). InnerVolumeSpecName "kube-api-access-2pxqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.403419 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b604-account-create-8v4pd" Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.469419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h9ld\" (UniqueName: \"kubernetes.io/projected/f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71-kube-api-access-9h9ld\") pod \"f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71\" (UID: \"f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71\") " Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.469754 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pxqx\" (UniqueName: \"kubernetes.io/projected/85e090bf-b7d2-4f65-b8c0-147f61612834-kube-api-access-2pxqx\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.472349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71-kube-api-access-9h9ld" (OuterVolumeSpecName: "kube-api-access-9h9ld") pod "f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71" (UID: "f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71"). InnerVolumeSpecName "kube-api-access-9h9ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.513695 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.515446 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:16 crc kubenswrapper[4749]: I1001 13:24:16.571025 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h9ld\" (UniqueName: \"kubernetes.io/projected/f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71-kube-api-access-9h9ld\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.059823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b604-account-create-8v4pd" event={"ID":"f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71","Type":"ContainerDied","Data":"4957d5c68be51059b2154b13c41c6b06cc7c9cc89bb57df029be534100a0d6bc"} Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.059871 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4957d5c68be51059b2154b13c41c6b06cc7c9cc89bb57df029be534100a0d6bc" Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.059920 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b604-account-create-8v4pd" Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.063209 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-687d-account-create-9ssdm" Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.063253 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-687d-account-create-9ssdm" event={"ID":"85e090bf-b7d2-4f65-b8c0-147f61612834","Type":"ContainerDied","Data":"2fe86bf8c29e60d1754b3cf887b0249181a50ae3938f86d9d8106c6bdfe28efc"} Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.063594 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fe86bf8c29e60d1754b3cf887b0249181a50ae3938f86d9d8106c6bdfe28efc" Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.065264 4749 generic.go:334] "Generic (PLEG): container finished" podID="c84137f7-d1b7-422d-b628-5d681da78cb2" containerID="c35baff8f2f004747b4989fb62000677180bec81bb0d98f91dfbfa4b9749847b" exitCode=0 Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.065344 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sl4xv-config-kq5d2" event={"ID":"c84137f7-d1b7-422d-b628-5d681da78cb2","Type":"ContainerDied","Data":"c35baff8f2f004747b4989fb62000677180bec81bb0d98f91dfbfa4b9749847b"} Oct 01 13:24:17 crc kubenswrapper[4749]: I1001 13:24:17.067065 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.321688 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: \"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.356206 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6ef9f56d-2299-424f-9cc3-21cd7fcae8c1-etc-swift\") pod \"swift-storage-0\" (UID: 
\"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1\") " pod="openstack/swift-storage-0" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.462301 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.546818 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.626505 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-log-ovn\") pod \"c84137f7-d1b7-422d-b628-5d681da78cb2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.626652 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run-ovn\") pod \"c84137f7-d1b7-422d-b628-5d681da78cb2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.626690 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-scripts\") pod \"c84137f7-d1b7-422d-b628-5d681da78cb2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.626731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j49dw\" (UniqueName: \"kubernetes.io/projected/c84137f7-d1b7-422d-b628-5d681da78cb2-kube-api-access-j49dw\") pod \"c84137f7-d1b7-422d-b628-5d681da78cb2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.626759 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run\") pod \"c84137f7-d1b7-422d-b628-5d681da78cb2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.626835 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-additional-scripts\") pod \"c84137f7-d1b7-422d-b628-5d681da78cb2\" (UID: \"c84137f7-d1b7-422d-b628-5d681da78cb2\") " Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.628270 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c84137f7-d1b7-422d-b628-5d681da78cb2" (UID: "c84137f7-d1b7-422d-b628-5d681da78cb2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.628311 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c84137f7-d1b7-422d-b628-5d681da78cb2" (UID: "c84137f7-d1b7-422d-b628-5d681da78cb2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.628328 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c84137f7-d1b7-422d-b628-5d681da78cb2" (UID: "c84137f7-d1b7-422d-b628-5d681da78cb2"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.628759 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run" (OuterVolumeSpecName: "var-run") pod "c84137f7-d1b7-422d-b628-5d681da78cb2" (UID: "c84137f7-d1b7-422d-b628-5d681da78cb2"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.629046 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-scripts" (OuterVolumeSpecName: "scripts") pod "c84137f7-d1b7-422d-b628-5d681da78cb2" (UID: "c84137f7-d1b7-422d-b628-5d681da78cb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.633937 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c84137f7-d1b7-422d-b628-5d681da78cb2-kube-api-access-j49dw" (OuterVolumeSpecName: "kube-api-access-j49dw") pod "c84137f7-d1b7-422d-b628-5d681da78cb2" (UID: "c84137f7-d1b7-422d-b628-5d681da78cb2"). InnerVolumeSpecName "kube-api-access-j49dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.727685 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.727713 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.727721 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.727731 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c84137f7-d1b7-422d-b628-5d681da78cb2-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.727740 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j49dw\" (UniqueName: \"kubernetes.io/projected/c84137f7-d1b7-422d-b628-5d681da78cb2-kube-api-access-j49dw\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.727750 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c84137f7-d1b7-422d-b628-5d681da78cb2-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.776316 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1a64-account-create-f5wl6"] Oct 01 13:24:18 crc kubenswrapper[4749]: E1001 13:24:18.776605 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e090bf-b7d2-4f65-b8c0-147f61612834" 
containerName="mariadb-account-create" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.776620 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e090bf-b7d2-4f65-b8c0-147f61612834" containerName="mariadb-account-create" Oct 01 13:24:18 crc kubenswrapper[4749]: E1001 13:24:18.776637 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c84137f7-d1b7-422d-b628-5d681da78cb2" containerName="ovn-config" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.776644 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c84137f7-d1b7-422d-b628-5d681da78cb2" containerName="ovn-config" Oct 01 13:24:18 crc kubenswrapper[4749]: E1001 13:24:18.776660 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71" containerName="mariadb-account-create" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.776667 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71" containerName="mariadb-account-create" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.776819 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71" containerName="mariadb-account-create" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.776838 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e090bf-b7d2-4f65-b8c0-147f61612834" containerName="mariadb-account-create" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.776850 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c84137f7-d1b7-422d-b628-5d681da78cb2" containerName="ovn-config" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.777339 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1a64-account-create-f5wl6" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.779031 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.786686 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1a64-account-create-f5wl6"] Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.844485 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 13:24:18 crc kubenswrapper[4749]: I1001 13:24:18.930507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9drc\" (UniqueName: \"kubernetes.io/projected/dc5f870d-1b2f-4cc2-93c7-6375aef31397-kube-api-access-n9drc\") pod \"glance-1a64-account-create-f5wl6\" (UID: \"dc5f870d-1b2f-4cc2-93c7-6375aef31397\") " pod="openstack/glance-1a64-account-create-f5wl6" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.032036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9drc\" (UniqueName: \"kubernetes.io/projected/dc5f870d-1b2f-4cc2-93c7-6375aef31397-kube-api-access-n9drc\") pod \"glance-1a64-account-create-f5wl6\" (UID: \"dc5f870d-1b2f-4cc2-93c7-6375aef31397\") " pod="openstack/glance-1a64-account-create-f5wl6" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.052373 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9drc\" (UniqueName: \"kubernetes.io/projected/dc5f870d-1b2f-4cc2-93c7-6375aef31397-kube-api-access-n9drc\") pod \"glance-1a64-account-create-f5wl6\" (UID: \"dc5f870d-1b2f-4cc2-93c7-6375aef31397\") " pod="openstack/glance-1a64-account-create-f5wl6" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.078283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"c850410abde3bfe62e7ed47c0ba6e0ba1af31d5133ebcee77e085819fc33e162"} Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.079968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sl4xv-config-kq5d2" event={"ID":"c84137f7-d1b7-422d-b628-5d681da78cb2","Type":"ContainerDied","Data":"83f9b2697e229ffd98e91f83bbdca1df9e0413fdc6cdbd3523b4a15e5cc25537"} Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.080031 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83f9b2697e229ffd98e91f83bbdca1df9e0413fdc6cdbd3523b4a15e5cc25537" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.080009 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sl4xv-config-kq5d2" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.098821 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1a64-account-create-f5wl6" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.227961 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="35e1759f-e27a-4891-9fc0-37753b25689d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.309947 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.310177 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="prometheus" containerID="cri-o://7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161" gracePeriod=600 Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.310297 4749 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="thanos-sidecar" containerID="cri-o://860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a" gracePeriod=600 Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.310352 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="config-reloader" containerID="cri-o://e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247" gracePeriod=600 Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.534079 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="1d655140-d63d-4e40-8de1-875213f37d4a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.564485 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1a64-account-create-f5wl6"] Oct 01 13:24:19 crc kubenswrapper[4749]: W1001 13:24:19.599645 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc5f870d_1b2f_4cc2_93c7_6375aef31397.slice/crio-d98b44ce9400dcd46897466598022a3a1ed0eb512b668dcf9597c9edd8672beb WatchSource:0}: Error finding container d98b44ce9400dcd46897466598022a3a1ed0eb512b668dcf9597c9edd8672beb: Status 404 returned error can't find the container with id d98b44ce9400dcd46897466598022a3a1ed0eb512b668dcf9597c9edd8672beb Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.676693 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sl4xv-config-kq5d2"] Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.684846 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-controller-sl4xv-config-kq5d2"] Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.800164 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-sl4xv-config-pwpt4"] Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.801196 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.822797 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sl4xv-config-pwpt4"] Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.822963 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.833729 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.959339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bdrb\" (UniqueName: \"kubernetes.io/projected/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-kube-api-access-6bdrb\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.959428 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-scripts\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.959527 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run-ovn\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.959765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.959825 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-log-ovn\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.959930 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-additional-scripts\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:19 crc kubenswrapper[4749]: I1001 13:24:19.992665 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.024491 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-sl4xv" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.061655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-scripts\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.061703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run-ovn\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.061774 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.061800 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-log-ovn\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.061838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-additional-scripts\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.061860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bdrb\" (UniqueName: \"kubernetes.io/projected/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-kube-api-access-6bdrb\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.062113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run-ovn\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.062146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.062113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-log-ovn\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.062591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-additional-scripts\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.064107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-scripts\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.087038 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a64-account-create-f5wl6" event={"ID":"dc5f870d-1b2f-4cc2-93c7-6375aef31397","Type":"ContainerStarted","Data":"d98b44ce9400dcd46897466598022a3a1ed0eb512b668dcf9597c9edd8672beb"} Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.095064 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bdrb\" (UniqueName: \"kubernetes.io/projected/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-kube-api-access-6bdrb\") pod \"ovn-controller-sl4xv-config-pwpt4\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097549 4749 generic.go:334] "Generic (PLEG): container finished" podID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerID="860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a" exitCode=0 Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097572 4749 generic.go:334] "Generic (PLEG): container finished" podID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerID="e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247" exitCode=0 Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097579 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerID="7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161" exitCode=0 Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097598 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerDied","Data":"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a"} Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerDied","Data":"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247"} Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerDied","Data":"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161"} Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0620852a-daa1-4b0a-91ea-910dd2c379c9","Type":"ContainerDied","Data":"da8ce3baa560a4664d972965bc675dd5e94205d13d6beb6b9674932cb85d3b8f"} Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097664 4749 scope.go:117] "RemoveContainer" containerID="860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.097787 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.114078 4749 scope.go:117] "RemoveContainer" containerID="e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.155752 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.163699 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"0620852a-daa1-4b0a-91ea-910dd2c379c9\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.163797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-web-config\") pod \"0620852a-daa1-4b0a-91ea-910dd2c379c9\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.163816 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-thanos-prometheus-http-client-file\") pod \"0620852a-daa1-4b0a-91ea-910dd2c379c9\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.163870 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0620852a-daa1-4b0a-91ea-910dd2c379c9-config-out\") pod \"0620852a-daa1-4b0a-91ea-910dd2c379c9\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.163895 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-config\") pod \"0620852a-daa1-4b0a-91ea-910dd2c379c9\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.163914 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-tls-assets\") pod \"0620852a-daa1-4b0a-91ea-910dd2c379c9\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.163988 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44qf4\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-kube-api-access-44qf4\") pod \"0620852a-daa1-4b0a-91ea-910dd2c379c9\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.164021 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0620852a-daa1-4b0a-91ea-910dd2c379c9-prometheus-metric-storage-rulefiles-0\") pod \"0620852a-daa1-4b0a-91ea-910dd2c379c9\" (UID: \"0620852a-daa1-4b0a-91ea-910dd2c379c9\") " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.166900 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0620852a-daa1-4b0a-91ea-910dd2c379c9-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "0620852a-daa1-4b0a-91ea-910dd2c379c9" (UID: "0620852a-daa1-4b0a-91ea-910dd2c379c9"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.174483 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0620852a-daa1-4b0a-91ea-910dd2c379c9" (UID: "0620852a-daa1-4b0a-91ea-910dd2c379c9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.178350 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0620852a-daa1-4b0a-91ea-910dd2c379c9" (UID: "0620852a-daa1-4b0a-91ea-910dd2c379c9"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.181433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-config" (OuterVolumeSpecName: "config") pod "0620852a-daa1-4b0a-91ea-910dd2c379c9" (UID: "0620852a-daa1-4b0a-91ea-910dd2c379c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.182471 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-kube-api-access-44qf4" (OuterVolumeSpecName: "kube-api-access-44qf4") pod "0620852a-daa1-4b0a-91ea-910dd2c379c9" (UID: "0620852a-daa1-4b0a-91ea-910dd2c379c9"). InnerVolumeSpecName "kube-api-access-44qf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.184377 4749 scope.go:117] "RemoveContainer" containerID="7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.184546 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0620852a-daa1-4b0a-91ea-910dd2c379c9-config-out" (OuterVolumeSpecName: "config-out") pod "0620852a-daa1-4b0a-91ea-910dd2c379c9" (UID: "0620852a-daa1-4b0a-91ea-910dd2c379c9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.243876 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-web-config" (OuterVolumeSpecName: "web-config") pod "0620852a-daa1-4b0a-91ea-910dd2c379c9" (UID: "0620852a-daa1-4b0a-91ea-910dd2c379c9"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.257565 4749 scope.go:117] "RemoveContainer" containerID="b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.265444 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.265473 4749 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.265484 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44qf4\" (UniqueName: \"kubernetes.io/projected/0620852a-daa1-4b0a-91ea-910dd2c379c9-kube-api-access-44qf4\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.265494 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0620852a-daa1-4b0a-91ea-910dd2c379c9-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.265503 4749 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-web-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.265512 4749 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0620852a-daa1-4b0a-91ea-910dd2c379c9-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.265520 4749 reconciler_common.go:293] 
"Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0620852a-daa1-4b0a-91ea-910dd2c379c9-config-out\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.320855 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "0620852a-daa1-4b0a-91ea-910dd2c379c9" (UID: "0620852a-daa1-4b0a-91ea-910dd2c379c9"). InnerVolumeSpecName "pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.344462 4749 scope.go:117] "RemoveContainer" containerID="860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a" Oct 01 13:24:20 crc kubenswrapper[4749]: E1001 13:24:20.347914 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a\": container with ID starting with 860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a not found: ID does not exist" containerID="860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.347959 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a"} err="failed to get container status \"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a\": rpc error: code = NotFound desc = could not find container \"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a\": container with ID starting with 860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.347985 4749 scope.go:117] "RemoveContainer" 
containerID="e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247" Oct 01 13:24:20 crc kubenswrapper[4749]: E1001 13:24:20.348746 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247\": container with ID starting with e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247 not found: ID does not exist" containerID="e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.348779 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247"} err="failed to get container status \"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247\": rpc error: code = NotFound desc = could not find container \"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247\": container with ID starting with e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247 not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.348803 4749 scope.go:117] "RemoveContainer" containerID="7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161" Oct 01 13:24:20 crc kubenswrapper[4749]: E1001 13:24:20.352295 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161\": container with ID starting with 7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161 not found: ID does not exist" containerID="7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.352316 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161"} err="failed to get container status \"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161\": rpc error: code = NotFound desc = could not find container \"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161\": container with ID starting with 7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161 not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.352331 4749 scope.go:117] "RemoveContainer" containerID="b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed" Oct 01 13:24:20 crc kubenswrapper[4749]: E1001 13:24:20.352537 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed\": container with ID starting with b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed not found: ID does not exist" containerID="b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.352567 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed"} err="failed to get container status \"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed\": rpc error: code = NotFound desc = could not find container \"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed\": container with ID starting with b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.352582 4749 scope.go:117] "RemoveContainer" containerID="860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.352766 4749 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a"} err="failed to get container status \"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a\": rpc error: code = NotFound desc = could not find container \"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a\": container with ID starting with 860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.352779 4749 scope.go:117] "RemoveContainer" containerID="e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.352966 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247"} err="failed to get container status \"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247\": rpc error: code = NotFound desc = could not find container \"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247\": container with ID starting with e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247 not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.352993 4749 scope.go:117] "RemoveContainer" containerID="7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.353182 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161"} err="failed to get container status \"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161\": rpc error: code = NotFound desc = could not find container \"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161\": container with ID starting with 7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161 not 
found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.353196 4749 scope.go:117] "RemoveContainer" containerID="b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.353409 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed"} err="failed to get container status \"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed\": rpc error: code = NotFound desc = could not find container \"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed\": container with ID starting with b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.353422 4749 scope.go:117] "RemoveContainer" containerID="860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.353610 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a"} err="failed to get container status \"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a\": rpc error: code = NotFound desc = could not find container \"860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a\": container with ID starting with 860b58e2bcbdb0426a71cf27612684ac195887076313e4d1646edcd4f5fbb63a not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.353637 4749 scope.go:117] "RemoveContainer" containerID="e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.353810 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247"} err="failed to get 
container status \"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247\": rpc error: code = NotFound desc = could not find container \"e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247\": container with ID starting with e8b0c4b151d6f207c71ac3a7b2eb50a11c4c07f5592383ef7dac410f07d9f247 not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.353822 4749 scope.go:117] "RemoveContainer" containerID="7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.354011 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161"} err="failed to get container status \"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161\": rpc error: code = NotFound desc = could not find container \"7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161\": container with ID starting with 7d22fc4b075fd12dfdac27f770b32397123d281312f0302b4d9befd4aa8dd161 not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.354026 4749 scope.go:117] "RemoveContainer" containerID="b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.355330 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed"} err="failed to get container status \"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed\": rpc error: code = NotFound desc = could not find container \"b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed\": container with ID starting with b2b511be8ee093c5a3cfee2c39b9e4e8193bf927baf91373ce7bb8c60a1fcaed not found: ID does not exist" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.366688 4749 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") on node \"crc\" " Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.391946 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.392093 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a") on node "crc" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.460895 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.471335 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.480100 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.505949 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:24:20 crc kubenswrapper[4749]: E1001 13:24:20.506301 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="prometheus" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.506313 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="prometheus" Oct 01 13:24:20 crc kubenswrapper[4749]: E1001 13:24:20.506324 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="thanos-sidecar" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.506330 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="thanos-sidecar" Oct 01 13:24:20 crc kubenswrapper[4749]: E1001 13:24:20.506342 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="config-reloader" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.506348 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="config-reloader" Oct 01 13:24:20 crc kubenswrapper[4749]: E1001 13:24:20.506362 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="init-config-reloader" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.506368 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="init-config-reloader" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.506522 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="config-reloader" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.506536 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="prometheus" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.506547 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" containerName="thanos-sidecar" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.512069 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.515142 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.515205 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.515312 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.515525 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.515680 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-df59c" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.515913 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.525862 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.570953 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-sl4xv-config-pwpt4"] Oct 01 13:24:20 crc kubenswrapper[4749]: W1001 13:24:20.600487 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c064aaa_3d55_43d2_a5da_b0b67f7f6996.slice/crio-e9637bba01a8e7b85bfdb2103c2d828bb22ac0ec7e58f088622a15f806f13ad1 WatchSource:0}: Error finding container e9637bba01a8e7b85bfdb2103c2d828bb22ac0ec7e58f088622a15f806f13ad1: Status 404 returned error can't find the container with 
id e9637bba01a8e7b85bfdb2103c2d828bb22ac0ec7e58f088622a15f806f13ad1 Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.601361 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6n79\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-kube-api-access-g6n79\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " 
pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bce57e9e-7e46-4ac2-a709-e978a98e4575-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bce57e9e-7e46-4ac2-a709-e978a98e4575-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677892 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.677969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-config\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779124 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779157 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779209 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-config\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779268 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc 
kubenswrapper[4749]: I1001 13:24:20.779309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6n79\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-kube-api-access-g6n79\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779373 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bce57e9e-7e46-4ac2-a709-e978a98e4575-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.779396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bce57e9e-7e46-4ac2-a709-e978a98e4575-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.780117 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bce57e9e-7e46-4ac2-a709-e978a98e4575-prometheus-metric-storage-rulefiles-0\") pod 
\"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.785591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bce57e9e-7e46-4ac2-a709-e978a98e4575-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.786317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.786774 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.787063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.787136 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.787175 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/486c2550edb1c82035b2963cc60708287426e0ee5d361c9f2f060b32a3c68a50/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.787194 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-config\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.788837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.792533 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.794536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.812917 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6n79\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-kube-api-access-g6n79\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.848345 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:20 crc kubenswrapper[4749]: I1001 13:24:20.860469 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.130107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"ee9443a350dd08218dccc56de033dc3005da555edab88fdebcb436e4cb6e23c6"} Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.130473 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"86f99c34abce6beb215828b05a9aead61e5399fcf4a7d8058c248c2d7d0f1857"} Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.130486 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"f8407a1aca9e47a5bd1a3a036db0b0ae557522a8053fbb7456110a4a54e90192"} Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.131623 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc5f870d-1b2f-4cc2-93c7-6375aef31397" containerID="b80ac3918b2c3cbc347792ef710b30513ad2f7f90512c91ee6b469b1520aee00" exitCode=0 Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.131683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a64-account-create-f5wl6" event={"ID":"dc5f870d-1b2f-4cc2-93c7-6375aef31397","Type":"ContainerDied","Data":"b80ac3918b2c3cbc347792ef710b30513ad2f7f90512c91ee6b469b1520aee00"} Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.139360 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sl4xv-config-pwpt4" event={"ID":"1c064aaa-3d55-43d2-a5da-b0b67f7f6996","Type":"ContainerStarted","Data":"d8145983aaf7b8e562fd6c9dcf52eb6183bcc88850615e2454d908616a11b3ee"} Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.139396 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-sl4xv-config-pwpt4" event={"ID":"1c064aaa-3d55-43d2-a5da-b0b67f7f6996","Type":"ContainerStarted","Data":"e9637bba01a8e7b85bfdb2103c2d828bb22ac0ec7e58f088622a15f806f13ad1"} Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.219409 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-sl4xv-config-pwpt4" podStartSLOduration=2.219382905 podStartE2EDuration="2.219382905s" podCreationTimestamp="2025-10-01 13:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:24:21.196023627 +0000 UTC m=+1121.250008526" watchObservedRunningTime="2025-10-01 13:24:21.219382905 +0000 UTC m=+1121.273367804" Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.244529 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0620852a-daa1-4b0a-91ea-910dd2c379c9" path="/var/lib/kubelet/pods/0620852a-daa1-4b0a-91ea-910dd2c379c9/volumes" Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.245166 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c84137f7-d1b7-422d-b628-5d681da78cb2" path="/var/lib/kubelet/pods/c84137f7-d1b7-422d-b628-5d681da78cb2/volumes" Oct 01 13:24:21 crc kubenswrapper[4749]: I1001 13:24:21.334634 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:24:21 crc kubenswrapper[4749]: W1001 13:24:21.344178 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbce57e9e_7e46_4ac2_a709_e978a98e4575.slice/crio-aababca1d28fc74c9a344b5824e68f9ee0f357bc46629e420429ddf9f32190f4 WatchSource:0}: Error finding container aababca1d28fc74c9a344b5824e68f9ee0f357bc46629e420429ddf9f32190f4: Status 404 returned error can't find the container with id aababca1d28fc74c9a344b5824e68f9ee0f357bc46629e420429ddf9f32190f4 Oct 01 13:24:22 crc 
kubenswrapper[4749]: I1001 13:24:22.148357 4749 generic.go:334] "Generic (PLEG): container finished" podID="1c064aaa-3d55-43d2-a5da-b0b67f7f6996" containerID="d8145983aaf7b8e562fd6c9dcf52eb6183bcc88850615e2454d908616a11b3ee" exitCode=0 Oct 01 13:24:22 crc kubenswrapper[4749]: I1001 13:24:22.148439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sl4xv-config-pwpt4" event={"ID":"1c064aaa-3d55-43d2-a5da-b0b67f7f6996","Type":"ContainerDied","Data":"d8145983aaf7b8e562fd6c9dcf52eb6183bcc88850615e2454d908616a11b3ee"} Oct 01 13:24:22 crc kubenswrapper[4749]: I1001 13:24:22.154002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"9b14d2f3c774ca4a3d734d1d0fc97d56cc8d7246d2d9ca590e87710059f83407"} Oct 01 13:24:22 crc kubenswrapper[4749]: I1001 13:24:22.154073 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"ad7d58c7014f726cbfbc9df25a939b523b4f1652c9099fe2d42edf56e61e1547"} Oct 01 13:24:22 crc kubenswrapper[4749]: I1001 13:24:22.155692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerStarted","Data":"aababca1d28fc74c9a344b5824e68f9ee0f357bc46629e420429ddf9f32190f4"} Oct 01 13:24:22 crc kubenswrapper[4749]: I1001 13:24:22.549983 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1a64-account-create-f5wl6" Oct 01 13:24:22 crc kubenswrapper[4749]: I1001 13:24:22.710579 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9drc\" (UniqueName: \"kubernetes.io/projected/dc5f870d-1b2f-4cc2-93c7-6375aef31397-kube-api-access-n9drc\") pod \"dc5f870d-1b2f-4cc2-93c7-6375aef31397\" (UID: \"dc5f870d-1b2f-4cc2-93c7-6375aef31397\") " Oct 01 13:24:22 crc kubenswrapper[4749]: I1001 13:24:22.720516 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc5f870d-1b2f-4cc2-93c7-6375aef31397-kube-api-access-n9drc" (OuterVolumeSpecName: "kube-api-access-n9drc") pod "dc5f870d-1b2f-4cc2-93c7-6375aef31397" (UID: "dc5f870d-1b2f-4cc2-93c7-6375aef31397"). InnerVolumeSpecName "kube-api-access-n9drc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:22 crc kubenswrapper[4749]: I1001 13:24:22.812698 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9drc\" (UniqueName: \"kubernetes.io/projected/dc5f870d-1b2f-4cc2-93c7-6375aef31397-kube-api-access-n9drc\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.172590 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1a64-account-create-f5wl6" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.172757 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1a64-account-create-f5wl6" event={"ID":"dc5f870d-1b2f-4cc2-93c7-6375aef31397","Type":"ContainerDied","Data":"d98b44ce9400dcd46897466598022a3a1ed0eb512b668dcf9597c9edd8672beb"} Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.173794 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d98b44ce9400dcd46897466598022a3a1ed0eb512b668dcf9597c9edd8672beb" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.175540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"7fb88ca105238bdb58fd11ab6057636a12bc47bc003a39f0f579d0cb55fdf77d"} Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.175569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"11846f1bb564f4cf165f1c08b6cd534a8922a5009554ed493a5d11130f2b3788"} Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.175579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"bca495033a9d8f1f3f06c73bd139ed0da2bc0371c108f4199c3f4f6e01a766e3"} Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.699653 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.850460 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run-ovn\") pod \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.851072 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run\") pod \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.851153 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bdrb\" (UniqueName: \"kubernetes.io/projected/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-kube-api-access-6bdrb\") pod \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.851198 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-log-ovn\") pod \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.851256 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-scripts\") pod \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.851432 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-additional-scripts\") pod \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\" (UID: \"1c064aaa-3d55-43d2-a5da-b0b67f7f6996\") " Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.850577 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1c064aaa-3d55-43d2-a5da-b0b67f7f6996" (UID: "1c064aaa-3d55-43d2-a5da-b0b67f7f6996"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.852559 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1c064aaa-3d55-43d2-a5da-b0b67f7f6996" (UID: "1c064aaa-3d55-43d2-a5da-b0b67f7f6996"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.852614 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run" (OuterVolumeSpecName: "var-run") pod "1c064aaa-3d55-43d2-a5da-b0b67f7f6996" (UID: "1c064aaa-3d55-43d2-a5da-b0b67f7f6996"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.854610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1c064aaa-3d55-43d2-a5da-b0b67f7f6996" (UID: "1c064aaa-3d55-43d2-a5da-b0b67f7f6996"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.855412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-scripts" (OuterVolumeSpecName: "scripts") pod "1c064aaa-3d55-43d2-a5da-b0b67f7f6996" (UID: "1c064aaa-3d55-43d2-a5da-b0b67f7f6996"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.861083 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-kube-api-access-6bdrb" (OuterVolumeSpecName: "kube-api-access-6bdrb") pod "1c064aaa-3d55-43d2-a5da-b0b67f7f6996" (UID: "1c064aaa-3d55-43d2-a5da-b0b67f7f6996"). InnerVolumeSpecName "kube-api-access-6bdrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.930835 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-d55n6"] Oct 01 13:24:23 crc kubenswrapper[4749]: E1001 13:24:23.931275 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c064aaa-3d55-43d2-a5da-b0b67f7f6996" containerName="ovn-config" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.931296 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c064aaa-3d55-43d2-a5da-b0b67f7f6996" containerName="ovn-config" Oct 01 13:24:23 crc kubenswrapper[4749]: E1001 13:24:23.931313 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5f870d-1b2f-4cc2-93c7-6375aef31397" containerName="mariadb-account-create" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.931321 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5f870d-1b2f-4cc2-93c7-6375aef31397" containerName="mariadb-account-create" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.931568 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1c064aaa-3d55-43d2-a5da-b0b67f7f6996" containerName="ovn-config" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.931609 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc5f870d-1b2f-4cc2-93c7-6375aef31397" containerName="mariadb-account-create" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.932330 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.934274 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mvrfw" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.940754 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.945025 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-d55n6"] Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.957461 4749 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.957491 4749 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.957501 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bdrb\" (UniqueName: \"kubernetes.io/projected/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-kube-api-access-6bdrb\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.957510 4749 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-var-log-ovn\") on 
node \"crc\" DevicePath \"\"" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.957518 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:23 crc kubenswrapper[4749]: I1001 13:24:23.957526 4749 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c064aaa-3d55-43d2-a5da-b0b67f7f6996-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.058963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-db-sync-config-data\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.059056 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-combined-ca-bundle\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.059078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-config-data\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.059111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8l5g\" (UniqueName: 
\"kubernetes.io/projected/7bcae573-48fa-4920-8b8f-4df57d4c5375-kube-api-access-p8l5g\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.160415 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-db-sync-config-data\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.160499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-combined-ca-bundle\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.160519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-config-data\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.160552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8l5g\" (UniqueName: \"kubernetes.io/projected/7bcae573-48fa-4920-8b8f-4df57d4c5375-kube-api-access-p8l5g\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.164615 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-combined-ca-bundle\") pod \"glance-db-sync-d55n6\" 
(UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.164900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-db-sync-config-data\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.166322 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-config-data\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.174716 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8l5g\" (UniqueName: \"kubernetes.io/projected/7bcae573-48fa-4920-8b8f-4df57d4c5375-kube-api-access-p8l5g\") pod \"glance-db-sync-d55n6\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.185171 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-sl4xv-config-pwpt4" event={"ID":"1c064aaa-3d55-43d2-a5da-b0b67f7f6996","Type":"ContainerDied","Data":"e9637bba01a8e7b85bfdb2103c2d828bb22ac0ec7e58f088622a15f806f13ad1"} Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.185204 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9637bba01a8e7b85bfdb2103c2d828bb22ac0ec7e58f088622a15f806f13ad1" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.185234 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-sl4xv-config-pwpt4" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.191092 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"c38a05ec2113f67abb13b775f25ad893291d54a7a45fc9557b78bdbb67bd1a80"} Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.191159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"caeef81bb4fe47220b139880e5cd7ad1e45cc0c4aad270a935c076a05dd44fcd"} Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.192456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerStarted","Data":"fad1cb8eb096fc62719b51016d81f562e1e5098c6f44ac776c43ab82da9ec44d"} Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.258113 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-d55n6" Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.633362 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-d55n6"] Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.798065 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-sl4xv-config-pwpt4"] Oct 01 13:24:24 crc kubenswrapper[4749]: I1001 13:24:24.805980 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-sl4xv-config-pwpt4"] Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.210868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d55n6" event={"ID":"7bcae573-48fa-4920-8b8f-4df57d4c5375","Type":"ContainerStarted","Data":"41f2867c999513ee36702ba407a9edb9d82cd0bda93f1d3e9ef23ab73ad63b97"} Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.241003 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c064aaa-3d55-43d2-a5da-b0b67f7f6996" path="/var/lib/kubelet/pods/1c064aaa-3d55-43d2-a5da-b0b67f7f6996/volumes" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.242074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"f8aa26737607dac33811c347a3ede247f7158a18a4e0d697df0bcf2571f81d34"} Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.242109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"1909488e708bb30f6197e19611bae32bcc8a2d2a115e1db3a3454896f12dac81"} Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.242120 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"2d5124fd80bdf77db796301890540a83e27b827cb4f7e72697b705b3aef674d4"} Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.242130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"a725fd944f66ff323d71d92112d99781ea42156d036126591b8fc3c3c9ea5450"} Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.242139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"6ef9f56d-2299-424f-9cc3-21cd7fcae8c1","Type":"ContainerStarted","Data":"6d6d899e0f45f45a69a5d7df12c5e2c3f4e7d754d91d6057ef7b306f487d4b7b"} Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.274383 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.407095666000004 podStartE2EDuration="40.27432749s" podCreationTimestamp="2025-10-01 13:23:45 +0000 UTC" firstStartedPulling="2025-10-01 13:24:18.856786929 +0000 UTC m=+1118.910771828" lastFinishedPulling="2025-10-01 13:24:23.724018753 +0000 UTC m=+1123.778003652" observedRunningTime="2025-10-01 13:24:25.264690677 +0000 UTC m=+1125.318675586" watchObservedRunningTime="2025-10-01 13:24:25.27432749 +0000 UTC m=+1125.328312389" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.567569 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f4bccb99-dh9mp"] Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.569247 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.577837 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.589197 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4bccb99-dh9mp"] Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.690168 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-config\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.690271 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-svc\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.690336 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z64tx\" (UniqueName: \"kubernetes.io/projected/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-kube-api-access-z64tx\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.690369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " 
pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.690403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.690462 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.791643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.791722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-config\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.791778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-svc\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" 
Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.791845 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z64tx\" (UniqueName: \"kubernetes.io/projected/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-kube-api-access-z64tx\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.791882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.791922 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.793017 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-swift-storage-0\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.793438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-sb\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 
13:24:25.793693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-svc\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.794325 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-config\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.794410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-nb\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.812739 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z64tx\" (UniqueName: \"kubernetes.io/projected/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-kube-api-access-z64tx\") pod \"dnsmasq-dns-5f4bccb99-dh9mp\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:25 crc kubenswrapper[4749]: I1001 13:24:25.890687 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:26 crc kubenswrapper[4749]: W1001 13:24:26.384514 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9138f50d_1c41_4a52_84f6_2e3fa1e7f245.slice/crio-709e9de2268a803099cc88ccc907480b8bf03330027c039cbe1a8c622a601b57 WatchSource:0}: Error finding container 709e9de2268a803099cc88ccc907480b8bf03330027c039cbe1a8c622a601b57: Status 404 returned error can't find the container with id 709e9de2268a803099cc88ccc907480b8bf03330027c039cbe1a8c622a601b57 Oct 01 13:24:26 crc kubenswrapper[4749]: I1001 13:24:26.385677 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f4bccb99-dh9mp"] Oct 01 13:24:27 crc kubenswrapper[4749]: I1001 13:24:27.251319 4749 generic.go:334] "Generic (PLEG): container finished" podID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerID="151966dc1dc660950b8d52ac40df122796bac5e5eea85049d947ad7e333fad29" exitCode=0 Oct 01 13:24:27 crc kubenswrapper[4749]: I1001 13:24:27.251394 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" event={"ID":"9138f50d-1c41-4a52-84f6-2e3fa1e7f245","Type":"ContainerDied","Data":"151966dc1dc660950b8d52ac40df122796bac5e5eea85049d947ad7e333fad29"} Oct 01 13:24:27 crc kubenswrapper[4749]: I1001 13:24:27.251612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" event={"ID":"9138f50d-1c41-4a52-84f6-2e3fa1e7f245","Type":"ContainerStarted","Data":"709e9de2268a803099cc88ccc907480b8bf03330027c039cbe1a8c622a601b57"} Oct 01 13:24:28 crc kubenswrapper[4749]: I1001 13:24:28.264078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" event={"ID":"9138f50d-1c41-4a52-84f6-2e3fa1e7f245","Type":"ContainerStarted","Data":"f7713d48470e3e620d6bdb4a22f20f9c3c9dfc31a1997b43431b79f3cbb5dac5"} Oct 01 13:24:28 crc 
kubenswrapper[4749]: I1001 13:24:28.264236 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:28 crc kubenswrapper[4749]: I1001 13:24:28.289388 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" podStartSLOduration=3.289365776 podStartE2EDuration="3.289365776s" podCreationTimestamp="2025-10-01 13:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:24:28.283433314 +0000 UTC m=+1128.337418223" watchObservedRunningTime="2025-10-01 13:24:28.289365776 +0000 UTC m=+1128.343350685" Oct 01 13:24:29 crc kubenswrapper[4749]: I1001 13:24:29.227804 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:24:29 crc kubenswrapper[4749]: I1001 13:24:29.533466 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Oct 01 13:24:29 crc kubenswrapper[4749]: I1001 13:24:29.819128 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 13:24:30 crc kubenswrapper[4749]: I1001 13:24:30.285913 4749 generic.go:334] "Generic (PLEG): container finished" podID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerID="fad1cb8eb096fc62719b51016d81f562e1e5098c6f44ac776c43ab82da9ec44d" exitCode=0 Oct 01 13:24:30 crc kubenswrapper[4749]: I1001 13:24:30.285946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerDied","Data":"fad1cb8eb096fc62719b51016d81f562e1e5098c6f44ac776c43ab82da9ec44d"} Oct 01 13:24:31 crc kubenswrapper[4749]: I1001 13:24:31.314157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerStarted","Data":"a5ddceb0103edfb2ed73b6d0cc64290e1dc2ddda8428da5528f36c1520142145"} Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.105935 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.105998 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.106042 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.106831 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"910cd885f70644be5e766281d9b0c0085bea0ad6a3102c2969a829e2725fb191"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.106896 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://910cd885f70644be5e766281d9b0c0085bea0ad6a3102c2969a829e2725fb191" gracePeriod=600 Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.284752 
4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-rws79"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.285768 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rws79" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.313434 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rws79"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.322705 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swr2t\" (UniqueName: \"kubernetes.io/projected/a2694945-782c-44b4-9613-4b5adb24c52f-kube-api-access-swr2t\") pod \"cinder-db-create-rws79\" (UID: \"a2694945-782c-44b4-9613-4b5adb24c52f\") " pod="openstack/cinder-db-create-rws79" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.331140 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="910cd885f70644be5e766281d9b0c0085bea0ad6a3102c2969a829e2725fb191" exitCode=0 Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.331179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"910cd885f70644be5e766281d9b0c0085bea0ad6a3102c2969a829e2725fb191"} Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.331212 4749 scope.go:117] "RemoveContainer" containerID="8d0168c71ffab7f25d1f8d0338f65483c24fac9eb00fb19dbac8b302415e4b3e" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.405358 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-8whfl"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.406343 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.410607 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.410785 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-nk64q" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.424446 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-combined-ca-bundle\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.424490 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-config-data\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.424529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95lvb\" (UniqueName: \"kubernetes.io/projected/9896d0f7-92a6-46b0-88ce-b64b390998c5-kube-api-access-95lvb\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.424553 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-db-sync-config-data\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 
01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.424637 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swr2t\" (UniqueName: \"kubernetes.io/projected/a2694945-782c-44b4-9613-4b5adb24c52f-kube-api-access-swr2t\") pod \"cinder-db-create-rws79\" (UID: \"a2694945-782c-44b4-9613-4b5adb24c52f\") " pod="openstack/cinder-db-create-rws79" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.424830 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-8whfl"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.475315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swr2t\" (UniqueName: \"kubernetes.io/projected/a2694945-782c-44b4-9613-4b5adb24c52f-kube-api-access-swr2t\") pod \"cinder-db-create-rws79\" (UID: \"a2694945-782c-44b4-9613-4b5adb24c52f\") " pod="openstack/cinder-db-create-rws79" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.487518 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6vcqk"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.494844 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6vcqk" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.502736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6vcqk"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.527108 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-config-data\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.527165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95lvb\" (UniqueName: \"kubernetes.io/projected/9896d0f7-92a6-46b0-88ce-b64b390998c5-kube-api-access-95lvb\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.527194 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-db-sync-config-data\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.527306 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9c2h\" (UniqueName: \"kubernetes.io/projected/13a817a5-ab51-4aa9-a26d-81f451d600d1-kube-api-access-q9c2h\") pod \"barbican-db-create-6vcqk\" (UID: \"13a817a5-ab51-4aa9-a26d-81f451d600d1\") " pod="openstack/barbican-db-create-6vcqk" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.527326 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-combined-ca-bundle\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.533728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-db-sync-config-data\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.534289 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-config-data\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.552643 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-combined-ca-bundle\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.561188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95lvb\" (UniqueName: \"kubernetes.io/projected/9896d0f7-92a6-46b0-88ce-b64b390998c5-kube-api-access-95lvb\") pod \"watcher-db-sync-8whfl\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") " pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.561854 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5lxgd"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.562993 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5lxgd" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.574707 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5lxgd"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.600031 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-rws79" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.628671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9c2h\" (UniqueName: \"kubernetes.io/projected/13a817a5-ab51-4aa9-a26d-81f451d600d1-kube-api-access-q9c2h\") pod \"barbican-db-create-6vcqk\" (UID: \"13a817a5-ab51-4aa9-a26d-81f451d600d1\") " pod="openstack/barbican-db-create-6vcqk" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.628799 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr4l7\" (UniqueName: \"kubernetes.io/projected/f26e2be9-292a-4e1d-8ef4-98c9d9989cb0-kube-api-access-sr4l7\") pod \"neutron-db-create-5lxgd\" (UID: \"f26e2be9-292a-4e1d-8ef4-98c9d9989cb0\") " pod="openstack/neutron-db-create-5lxgd" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.644486 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-mndqc"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.646409 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.651816 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.652039 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-776bb" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.652234 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.652368 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.663114 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mndqc"] Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.730959 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-config-data\") pod \"keystone-db-sync-mndqc\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.731047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr4l7\" (UniqueName: \"kubernetes.io/projected/f26e2be9-292a-4e1d-8ef4-98c9d9989cb0-kube-api-access-sr4l7\") pod \"neutron-db-create-5lxgd\" (UID: \"f26e2be9-292a-4e1d-8ef4-98c9d9989cb0\") " pod="openstack/neutron-db-create-5lxgd" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.731071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjj6z\" (UniqueName: \"kubernetes.io/projected/f7bb7cd0-7579-44b5-bac2-ae93c122858a-kube-api-access-kjj6z\") pod \"keystone-db-sync-mndqc\" (UID: 
\"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.731172 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-combined-ca-bundle\") pod \"keystone-db-sync-mndqc\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.732140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8whfl" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.744974 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9c2h\" (UniqueName: \"kubernetes.io/projected/13a817a5-ab51-4aa9-a26d-81f451d600d1-kube-api-access-q9c2h\") pod \"barbican-db-create-6vcqk\" (UID: \"13a817a5-ab51-4aa9-a26d-81f451d600d1\") " pod="openstack/barbican-db-create-6vcqk" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.758187 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr4l7\" (UniqueName: \"kubernetes.io/projected/f26e2be9-292a-4e1d-8ef4-98c9d9989cb0-kube-api-access-sr4l7\") pod \"neutron-db-create-5lxgd\" (UID: \"f26e2be9-292a-4e1d-8ef4-98c9d9989cb0\") " pod="openstack/neutron-db-create-5lxgd" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.814125 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6vcqk" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.832860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-combined-ca-bundle\") pod \"keystone-db-sync-mndqc\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.832930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-config-data\") pod \"keystone-db-sync-mndqc\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.832992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjj6z\" (UniqueName: \"kubernetes.io/projected/f7bb7cd0-7579-44b5-bac2-ae93c122858a-kube-api-access-kjj6z\") pod \"keystone-db-sync-mndqc\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.837419 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-config-data\") pod \"keystone-db-sync-mndqc\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.842760 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-combined-ca-bundle\") pod \"keystone-db-sync-mndqc\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 
13:24:32.855979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjj6z\" (UniqueName: \"kubernetes.io/projected/f7bb7cd0-7579-44b5-bac2-ae93c122858a-kube-api-access-kjj6z\") pod \"keystone-db-sync-mndqc\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") " pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.934546 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5lxgd" Oct 01 13:24:32 crc kubenswrapper[4749]: I1001 13:24:32.966354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mndqc" Oct 01 13:24:33 crc kubenswrapper[4749]: I1001 13:24:33.343322 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerStarted","Data":"a6f3daa1c0f587d33e4df7a64667922fc219b3811b5d08d4866b6f5616863a10"} Oct 01 13:24:35 crc kubenswrapper[4749]: I1001 13:24:35.893397 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:24:35 crc kubenswrapper[4749]: I1001 13:24:35.971172 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666cf554b5-49snv"] Oct 01 13:24:35 crc kubenswrapper[4749]: I1001 13:24:35.971504 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666cf554b5-49snv" podUID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerName="dnsmasq-dns" containerID="cri-o://25a6a0400009b888b328975d5bfb4b2469500bd325c7b8a664c2487cf5afe1da" gracePeriod=10 Oct 01 13:24:36 crc kubenswrapper[4749]: I1001 13:24:36.376937 4749 generic.go:334] "Generic (PLEG): container finished" podID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerID="25a6a0400009b888b328975d5bfb4b2469500bd325c7b8a664c2487cf5afe1da" exitCode=0 Oct 01 13:24:36 crc 
kubenswrapper[4749]: I1001 13:24:36.376978 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666cf554b5-49snv" event={"ID":"cb8d09d5-0c47-4f41-b06f-8915f1dd0676","Type":"ContainerDied","Data":"25a6a0400009b888b328975d5bfb4b2469500bd325c7b8a664c2487cf5afe1da"} Oct 01 13:24:40 crc kubenswrapper[4749]: I1001 13:24:40.462876 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666cf554b5-49snv" podUID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Oct 01 13:24:41 crc kubenswrapper[4749]: I1001 13:24:41.598025 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6vcqk"] Oct 01 13:24:41 crc kubenswrapper[4749]: I1001 13:24:41.619492 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5lxgd"] Oct 01 13:24:41 crc kubenswrapper[4749]: I1001 13:24:41.628935 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-8whfl"] Oct 01 13:24:41 crc kubenswrapper[4749]: W1001 13:24:41.629441 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf26e2be9_292a_4e1d_8ef4_98c9d9989cb0.slice/crio-1e042d03558c1531eb358f8f91f480b7de01b734c68faff7daa73990ce45ac4f WatchSource:0}: Error finding container 1e042d03558c1531eb358f8f91f480b7de01b734c68faff7daa73990ce45ac4f: Status 404 returned error can't find the container with id 1e042d03558c1531eb358f8f91f480b7de01b734c68faff7daa73990ce45ac4f Oct 01 13:24:41 crc kubenswrapper[4749]: I1001 13:24:41.637525 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-rws79"] Oct 01 13:24:41 crc kubenswrapper[4749]: W1001 13:24:41.646556 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2694945_782c_44b4_9613_4b5adb24c52f.slice/crio-ab7e90d4d321633d46db77fca9defde7dfb34c689c7b9b7d7b9fc75638d0e172 WatchSource:0}: Error finding container ab7e90d4d321633d46db77fca9defde7dfb34c689c7b9b7d7b9fc75638d0e172: Status 404 returned error can't find the container with id ab7e90d4d321633d46db77fca9defde7dfb34c689c7b9b7d7b9fc75638d0e172 Oct 01 13:24:41 crc kubenswrapper[4749]: E1001 13:24:41.694184 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Oct 01 13:24:41 crc kubenswrapper[4749]: E1001 13:24:41.694257 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-glance-api:watcher_latest" Oct 01 13:24:41 crc kubenswrapper[4749]: E1001 13:24:41.694376 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.30:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p8l5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-d55n6_openstack(7bcae573-48fa-4920-8b8f-4df57d4c5375): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Oct 01 13:24:41 crc kubenswrapper[4749]: E1001 13:24:41.695501 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-d55n6" podUID="7bcae573-48fa-4920-8b8f-4df57d4c5375" Oct 01 13:24:41 crc kubenswrapper[4749]: I1001 13:24:41.834749 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-mndqc"] Oct 01 13:24:41 crc kubenswrapper[4749]: W1001 13:24:41.864320 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7bb7cd0_7579_44b5_bac2_ae93c122858a.slice/crio-f535a3baa560b103995262d53df6b66e29b9da6e57224714ee2499adc28f95b9 WatchSource:0}: Error finding container f535a3baa560b103995262d53df6b66e29b9da6e57224714ee2499adc28f95b9: Status 404 returned error can't find the container with id f535a3baa560b103995262d53df6b66e29b9da6e57224714ee2499adc28f95b9 Oct 01 13:24:41 crc kubenswrapper[4749]: I1001 13:24:41.933452 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.020650 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c686n\" (UniqueName: \"kubernetes.io/projected/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-kube-api-access-c686n\") pod \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.020701 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-nb\") pod \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.020789 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-config\") pod \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.021458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-sb\") pod \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.021490 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-dns-svc\") pod \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\" (UID: \"cb8d09d5-0c47-4f41-b06f-8915f1dd0676\") " Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.036693 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-kube-api-access-c686n" (OuterVolumeSpecName: "kube-api-access-c686n") pod "cb8d09d5-0c47-4f41-b06f-8915f1dd0676" (UID: "cb8d09d5-0c47-4f41-b06f-8915f1dd0676"). InnerVolumeSpecName "kube-api-access-c686n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.062873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb8d09d5-0c47-4f41-b06f-8915f1dd0676" (UID: "cb8d09d5-0c47-4f41-b06f-8915f1dd0676"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.066871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb8d09d5-0c47-4f41-b06f-8915f1dd0676" (UID: "cb8d09d5-0c47-4f41-b06f-8915f1dd0676"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.069713 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb8d09d5-0c47-4f41-b06f-8915f1dd0676" (UID: "cb8d09d5-0c47-4f41-b06f-8915f1dd0676"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.070886 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-config" (OuterVolumeSpecName: "config") pod "cb8d09d5-0c47-4f41-b06f-8915f1dd0676" (UID: "cb8d09d5-0c47-4f41-b06f-8915f1dd0676"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.123410 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.123575 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.123708 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c686n\" (UniqueName: \"kubernetes.io/projected/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-kube-api-access-c686n\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.123840 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.123923 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8d09d5-0c47-4f41-b06f-8915f1dd0676-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.432920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666cf554b5-49snv" event={"ID":"cb8d09d5-0c47-4f41-b06f-8915f1dd0676","Type":"ContainerDied","Data":"00221eafe79c6e1559b94d85f5ae71d09cd5241ff2901be48eced96abdb516dd"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.433249 4749 scope.go:117] "RemoveContainer" containerID="25a6a0400009b888b328975d5bfb4b2469500bd325c7b8a664c2487cf5afe1da" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.433130 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666cf554b5-49snv" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.435136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8whfl" event={"ID":"9896d0f7-92a6-46b0-88ce-b64b390998c5","Type":"ContainerStarted","Data":"ec7ce3bba8ed31fe614588f7fd3a3d3beacf74b022bf240db55e9c6906b89d02"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.439766 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerStarted","Data":"b42e631344599e34e1e9b58969e05d024e40b0600037bad7b8df7b92d726f929"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.444351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mndqc" event={"ID":"f7bb7cd0-7579-44b5-bac2-ae93c122858a","Type":"ContainerStarted","Data":"f535a3baa560b103995262d53df6b66e29b9da6e57224714ee2499adc28f95b9"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.447449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"f591f132880451f6e2a795c1ad995a4e9513c1a6eef56b3898e6a9f77eb8baef"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.449635 4749 generic.go:334] "Generic (PLEG): container finished" podID="f26e2be9-292a-4e1d-8ef4-98c9d9989cb0" containerID="29cf94d9da7b55be344b02f852233b488d6adacf32b0baf10734190b34ad340d" exitCode=0 Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.449754 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5lxgd" event={"ID":"f26e2be9-292a-4e1d-8ef4-98c9d9989cb0","Type":"ContainerDied","Data":"29cf94d9da7b55be344b02f852233b488d6adacf32b0baf10734190b34ad340d"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.449775 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-db-create-5lxgd" event={"ID":"f26e2be9-292a-4e1d-8ef4-98c9d9989cb0","Type":"ContainerStarted","Data":"1e042d03558c1531eb358f8f91f480b7de01b734c68faff7daa73990ce45ac4f"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.451598 4749 generic.go:334] "Generic (PLEG): container finished" podID="a2694945-782c-44b4-9613-4b5adb24c52f" containerID="6720f9d321b381c7d988d12b9c1803ea121752f54b6405c028a939b068040d33" exitCode=0 Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.451685 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rws79" event={"ID":"a2694945-782c-44b4-9613-4b5adb24c52f","Type":"ContainerDied","Data":"6720f9d321b381c7d988d12b9c1803ea121752f54b6405c028a939b068040d33"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.451728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rws79" event={"ID":"a2694945-782c-44b4-9613-4b5adb24c52f","Type":"ContainerStarted","Data":"ab7e90d4d321633d46db77fca9defde7dfb34c689c7b9b7d7b9fc75638d0e172"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.453524 4749 generic.go:334] "Generic (PLEG): container finished" podID="13a817a5-ab51-4aa9-a26d-81f451d600d1" containerID="e934e7609fb2cefb489ea65c5f3c9dfab067f34d820cb1682e52b496f37f6621" exitCode=0 Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.454258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6vcqk" event={"ID":"13a817a5-ab51-4aa9-a26d-81f451d600d1","Type":"ContainerDied","Data":"e934e7609fb2cefb489ea65c5f3c9dfab067f34d820cb1682e52b496f37f6621"} Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.454287 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6vcqk" event={"ID":"13a817a5-ab51-4aa9-a26d-81f451d600d1","Type":"ContainerStarted","Data":"1cfacf64628d355e3e7da3e017e59043ce4cea0add064f037c43abb903b37524"} Oct 01 13:24:42 crc 
kubenswrapper[4749]: I1001 13:24:42.472382 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.472366404 podStartE2EDuration="22.472366404s" podCreationTimestamp="2025-10-01 13:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:24:42.46343054 +0000 UTC m=+1142.517415449" watchObservedRunningTime="2025-10-01 13:24:42.472366404 +0000 UTC m=+1142.526351293" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.472812 4749 scope.go:117] "RemoveContainer" containerID="7ccb01cd0f2ac61467ac856afe5e6481da5b3608d438a6bed589faf49ab54e8e" Oct 01 13:24:42 crc kubenswrapper[4749]: E1001 13:24:42.473110 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-d55n6" podUID="7bcae573-48fa-4920-8b8f-4df57d4c5375" Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.552482 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666cf554b5-49snv"] Oct 01 13:24:42 crc kubenswrapper[4749]: I1001 13:24:42.560076 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666cf554b5-49snv"] Oct 01 13:24:43 crc kubenswrapper[4749]: I1001 13:24:43.241199 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" path="/var/lib/kubelet/pods/cb8d09d5-0c47-4f41-b06f-8915f1dd0676/volumes" Oct 01 13:24:45 crc kubenswrapper[4749]: I1001 13:24:45.635484 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5lxgd" Oct 01 13:24:45 crc kubenswrapper[4749]: I1001 13:24:45.815043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr4l7\" (UniqueName: \"kubernetes.io/projected/f26e2be9-292a-4e1d-8ef4-98c9d9989cb0-kube-api-access-sr4l7\") pod \"f26e2be9-292a-4e1d-8ef4-98c9d9989cb0\" (UID: \"f26e2be9-292a-4e1d-8ef4-98c9d9989cb0\") " Oct 01 13:24:45 crc kubenswrapper[4749]: I1001 13:24:45.823476 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26e2be9-292a-4e1d-8ef4-98c9d9989cb0-kube-api-access-sr4l7" (OuterVolumeSpecName: "kube-api-access-sr4l7") pod "f26e2be9-292a-4e1d-8ef4-98c9d9989cb0" (UID: "f26e2be9-292a-4e1d-8ef4-98c9d9989cb0"). InnerVolumeSpecName "kube-api-access-sr4l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:45 crc kubenswrapper[4749]: I1001 13:24:45.861858 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:45 crc kubenswrapper[4749]: I1001 13:24:45.916868 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr4l7\" (UniqueName: \"kubernetes.io/projected/f26e2be9-292a-4e1d-8ef4-98c9d9989cb0-kube-api-access-sr4l7\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:46 crc kubenswrapper[4749]: I1001 13:24:46.498506 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5lxgd" event={"ID":"f26e2be9-292a-4e1d-8ef4-98c9d9989cb0","Type":"ContainerDied","Data":"1e042d03558c1531eb358f8f91f480b7de01b734c68faff7daa73990ce45ac4f"} Oct 01 13:24:46 crc kubenswrapper[4749]: I1001 13:24:46.498557 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e042d03558c1531eb358f8f91f480b7de01b734c68faff7daa73990ce45ac4f" Oct 01 13:24:46 crc kubenswrapper[4749]: I1001 13:24:46.498567 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5lxgd" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.130017 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6vcqk" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.273332 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9c2h\" (UniqueName: \"kubernetes.io/projected/13a817a5-ab51-4aa9-a26d-81f451d600d1-kube-api-access-q9c2h\") pod \"13a817a5-ab51-4aa9-a26d-81f451d600d1\" (UID: \"13a817a5-ab51-4aa9-a26d-81f451d600d1\") " Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.282103 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a817a5-ab51-4aa9-a26d-81f451d600d1-kube-api-access-q9c2h" (OuterVolumeSpecName: "kube-api-access-q9c2h") pod "13a817a5-ab51-4aa9-a26d-81f451d600d1" (UID: "13a817a5-ab51-4aa9-a26d-81f451d600d1"). InnerVolumeSpecName "kube-api-access-q9c2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.375946 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9c2h\" (UniqueName: \"kubernetes.io/projected/13a817a5-ab51-4aa9-a26d-81f451d600d1-kube-api-access-q9c2h\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.526665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6vcqk" event={"ID":"13a817a5-ab51-4aa9-a26d-81f451d600d1","Type":"ContainerDied","Data":"1cfacf64628d355e3e7da3e017e59043ce4cea0add064f037c43abb903b37524"} Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.526721 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cfacf64628d355e3e7da3e017e59043ce4cea0add064f037c43abb903b37524" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.526689 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6vcqk" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.528292 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-rws79" event={"ID":"a2694945-782c-44b4-9613-4b5adb24c52f","Type":"ContainerDied","Data":"ab7e90d4d321633d46db77fca9defde7dfb34c689c7b9b7d7b9fc75638d0e172"} Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.528323 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab7e90d4d321633d46db77fca9defde7dfb34c689c7b9b7d7b9fc75638d0e172" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.591276 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rws79" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.781669 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swr2t\" (UniqueName: \"kubernetes.io/projected/a2694945-782c-44b4-9613-4b5adb24c52f-kube-api-access-swr2t\") pod \"a2694945-782c-44b4-9613-4b5adb24c52f\" (UID: \"a2694945-782c-44b4-9613-4b5adb24c52f\") " Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.785496 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2694945-782c-44b4-9613-4b5adb24c52f-kube-api-access-swr2t" (OuterVolumeSpecName: "kube-api-access-swr2t") pod "a2694945-782c-44b4-9613-4b5adb24c52f" (UID: "a2694945-782c-44b4-9613-4b5adb24c52f"). InnerVolumeSpecName "kube-api-access-swr2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:49 crc kubenswrapper[4749]: I1001 13:24:49.884142 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swr2t\" (UniqueName: \"kubernetes.io/projected/a2694945-782c-44b4-9613-4b5adb24c52f-kube-api-access-swr2t\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:50 crc kubenswrapper[4749]: I1001 13:24:50.545504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8whfl" event={"ID":"9896d0f7-92a6-46b0-88ce-b64b390998c5","Type":"ContainerStarted","Data":"3f319d3ac0ef0dbe05163254438f315aac780f6ff7d92346c4031e079894a432"} Oct 01 13:24:50 crc kubenswrapper[4749]: I1001 13:24:50.549050 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-rws79" Oct 01 13:24:50 crc kubenswrapper[4749]: I1001 13:24:50.549055 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mndqc" event={"ID":"f7bb7cd0-7579-44b5-bac2-ae93c122858a","Type":"ContainerStarted","Data":"4bfdecd9965f22ae3d86a86b872d4d7a032dbdc8dc61e50b6185755a7540ca09"} Oct 01 13:24:50 crc kubenswrapper[4749]: I1001 13:24:50.588260 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-8whfl" podStartSLOduration=10.611011824 podStartE2EDuration="18.588210727s" podCreationTimestamp="2025-10-01 13:24:32 +0000 UTC" firstStartedPulling="2025-10-01 13:24:41.62214163 +0000 UTC m=+1141.676126539" lastFinishedPulling="2025-10-01 13:24:49.599340533 +0000 UTC m=+1149.653325442" observedRunningTime="2025-10-01 13:24:50.567174792 +0000 UTC m=+1150.621159761" watchObservedRunningTime="2025-10-01 13:24:50.588210727 +0000 UTC m=+1150.642195656" Oct 01 13:24:50 crc kubenswrapper[4749]: I1001 13:24:50.603093 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-mndqc" podStartSLOduration=10.908433981 podStartE2EDuration="18.603062463s" podCreationTimestamp="2025-10-01 13:24:32 +0000 UTC" firstStartedPulling="2025-10-01 13:24:41.866816955 +0000 UTC m=+1141.920801854" lastFinishedPulling="2025-10-01 13:24:49.561445437 +0000 UTC m=+1149.615430336" observedRunningTime="2025-10-01 13:24:50.597524612 +0000 UTC m=+1150.651509551" watchObservedRunningTime="2025-10-01 13:24:50.603062463 +0000 UTC m=+1150.657047402" Oct 01 13:24:50 crc kubenswrapper[4749]: I1001 13:24:50.862300 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:50 crc kubenswrapper[4749]: I1001 13:24:50.872047 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:51 crc 
kubenswrapper[4749]: I1001 13:24:51.579992 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.422845 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ff7d-account-create-wcnjd"] Oct 01 13:24:52 crc kubenswrapper[4749]: E1001 13:24:52.423455 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a817a5-ab51-4aa9-a26d-81f451d600d1" containerName="mariadb-database-create" Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.423484 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a817a5-ab51-4aa9-a26d-81f451d600d1" containerName="mariadb-database-create" Oct 01 13:24:52 crc kubenswrapper[4749]: E1001 13:24:52.423524 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerName="init" Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.423538 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerName="init" Oct 01 13:24:52 crc kubenswrapper[4749]: E1001 13:24:52.423558 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26e2be9-292a-4e1d-8ef4-98c9d9989cb0" containerName="mariadb-database-create" Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.423570 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26e2be9-292a-4e1d-8ef4-98c9d9989cb0" containerName="mariadb-database-create" Oct 01 13:24:52 crc kubenswrapper[4749]: E1001 13:24:52.423600 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2694945-782c-44b4-9613-4b5adb24c52f" containerName="mariadb-database-create" Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.423612 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2694945-782c-44b4-9613-4b5adb24c52f" containerName="mariadb-database-create" Oct 01 13:24:52 crc kubenswrapper[4749]: E1001 
13:24:52.423635 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerName="dnsmasq-dns"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.423647 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerName="dnsmasq-dns"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.423965 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a817a5-ab51-4aa9-a26d-81f451d600d1" containerName="mariadb-database-create"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.424012 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2694945-782c-44b4-9613-4b5adb24c52f" containerName="mariadb-database-create"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.424027 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb8d09d5-0c47-4f41-b06f-8915f1dd0676" containerName="dnsmasq-dns"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.424055 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26e2be9-292a-4e1d-8ef4-98c9d9989cb0" containerName="mariadb-database-create"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.425033 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff7d-account-create-wcnjd"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.427425 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.439124 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ff7d-account-create-wcnjd"]
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.538125 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfxqc\" (UniqueName: \"kubernetes.io/projected/8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2-kube-api-access-sfxqc\") pod \"neutron-ff7d-account-create-wcnjd\" (UID: \"8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2\") " pod="openstack/neutron-ff7d-account-create-wcnjd"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.643197 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfxqc\" (UniqueName: \"kubernetes.io/projected/8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2-kube-api-access-sfxqc\") pod \"neutron-ff7d-account-create-wcnjd\" (UID: \"8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2\") " pod="openstack/neutron-ff7d-account-create-wcnjd"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.664405 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfxqc\" (UniqueName: \"kubernetes.io/projected/8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2-kube-api-access-sfxqc\") pod \"neutron-ff7d-account-create-wcnjd\" (UID: \"8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2\") " pod="openstack/neutron-ff7d-account-create-wcnjd"
Oct 01 13:24:52 crc kubenswrapper[4749]: I1001 13:24:52.779046 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff7d-account-create-wcnjd"
Oct 01 13:24:53 crc kubenswrapper[4749]: I1001 13:24:53.259247 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ff7d-account-create-wcnjd"]
Oct 01 13:24:53 crc kubenswrapper[4749]: I1001 13:24:53.584266 4749 generic.go:334] "Generic (PLEG): container finished" podID="8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2" containerID="3d7ccdae9c6117608e6a377d7acb6b411bcb8eefe49a5cd79e68070783576383" exitCode=0
Oct 01 13:24:53 crc kubenswrapper[4749]: I1001 13:24:53.584392 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff7d-account-create-wcnjd" event={"ID":"8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2","Type":"ContainerDied","Data":"3d7ccdae9c6117608e6a377d7acb6b411bcb8eefe49a5cd79e68070783576383"}
Oct 01 13:24:53 crc kubenswrapper[4749]: I1001 13:24:53.584428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff7d-account-create-wcnjd" event={"ID":"8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2","Type":"ContainerStarted","Data":"2da4cbfc85a7ceca3a62e77223406d7eb070036f6031aed379952da3b0d3c357"}
Oct 01 13:24:53 crc kubenswrapper[4749]: I1001 13:24:53.589160 4749 generic.go:334] "Generic (PLEG): container finished" podID="9896d0f7-92a6-46b0-88ce-b64b390998c5" containerID="3f319d3ac0ef0dbe05163254438f315aac780f6ff7d92346c4031e079894a432" exitCode=0
Oct 01 13:24:53 crc kubenswrapper[4749]: I1001 13:24:53.589235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8whfl" event={"ID":"9896d0f7-92a6-46b0-88ce-b64b390998c5","Type":"ContainerDied","Data":"3f319d3ac0ef0dbe05163254438f315aac780f6ff7d92346c4031e079894a432"}
Oct 01 13:24:54 crc kubenswrapper[4749]: I1001 13:24:54.605383 4749 generic.go:334] "Generic (PLEG): container finished" podID="f7bb7cd0-7579-44b5-bac2-ae93c122858a" containerID="4bfdecd9965f22ae3d86a86b872d4d7a032dbdc8dc61e50b6185755a7540ca09" exitCode=0
Oct 01 13:24:54 crc kubenswrapper[4749]: I1001 13:24:54.605474 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mndqc" event={"ID":"f7bb7cd0-7579-44b5-bac2-ae93c122858a","Type":"ContainerDied","Data":"4bfdecd9965f22ae3d86a86b872d4d7a032dbdc8dc61e50b6185755a7540ca09"}
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.153655 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff7d-account-create-wcnjd"
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.161495 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8whfl"
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.291150 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-db-sync-config-data\") pod \"9896d0f7-92a6-46b0-88ce-b64b390998c5\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") "
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.291283 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfxqc\" (UniqueName: \"kubernetes.io/projected/8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2-kube-api-access-sfxqc\") pod \"8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2\" (UID: \"8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2\") "
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.291357 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95lvb\" (UniqueName: \"kubernetes.io/projected/9896d0f7-92a6-46b0-88ce-b64b390998c5-kube-api-access-95lvb\") pod \"9896d0f7-92a6-46b0-88ce-b64b390998c5\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") "
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.291412 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-config-data\") pod \"9896d0f7-92a6-46b0-88ce-b64b390998c5\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") "
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.291470 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-combined-ca-bundle\") pod \"9896d0f7-92a6-46b0-88ce-b64b390998c5\" (UID: \"9896d0f7-92a6-46b0-88ce-b64b390998c5\") "
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.296747 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9896d0f7-92a6-46b0-88ce-b64b390998c5" (UID: "9896d0f7-92a6-46b0-88ce-b64b390998c5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.297119 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9896d0f7-92a6-46b0-88ce-b64b390998c5-kube-api-access-95lvb" (OuterVolumeSpecName: "kube-api-access-95lvb") pod "9896d0f7-92a6-46b0-88ce-b64b390998c5" (UID: "9896d0f7-92a6-46b0-88ce-b64b390998c5"). InnerVolumeSpecName "kube-api-access-95lvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.297166 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2-kube-api-access-sfxqc" (OuterVolumeSpecName: "kube-api-access-sfxqc") pod "8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2" (UID: "8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2"). InnerVolumeSpecName "kube-api-access-sfxqc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.314405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9896d0f7-92a6-46b0-88ce-b64b390998c5" (UID: "9896d0f7-92a6-46b0-88ce-b64b390998c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.340208 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-config-data" (OuterVolumeSpecName: "config-data") pod "9896d0f7-92a6-46b0-88ce-b64b390998c5" (UID: "9896d0f7-92a6-46b0-88ce-b64b390998c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.393461 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.393707 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfxqc\" (UniqueName: \"kubernetes.io/projected/8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2-kube-api-access-sfxqc\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.393777 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95lvb\" (UniqueName: \"kubernetes.io/projected/9896d0f7-92a6-46b0-88ce-b64b390998c5-kube-api-access-95lvb\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.393800 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.393812 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9896d0f7-92a6-46b0-88ce-b64b390998c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.622708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff7d-account-create-wcnjd" event={"ID":"8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2","Type":"ContainerDied","Data":"2da4cbfc85a7ceca3a62e77223406d7eb070036f6031aed379952da3b0d3c357"}
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.622842 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2da4cbfc85a7ceca3a62e77223406d7eb070036f6031aed379952da3b0d3c357"
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.622767 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff7d-account-create-wcnjd"
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.629755 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-8whfl"
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.629889 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-8whfl" event={"ID":"9896d0f7-92a6-46b0-88ce-b64b390998c5","Type":"ContainerDied","Data":"ec7ce3bba8ed31fe614588f7fd3a3d3beacf74b022bf240db55e9c6906b89d02"}
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.629939 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec7ce3bba8ed31fe614588f7fd3a3d3beacf74b022bf240db55e9c6906b89d02"
Oct 01 13:24:55 crc kubenswrapper[4749]: I1001 13:24:55.959265 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mndqc"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.113842 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjj6z\" (UniqueName: \"kubernetes.io/projected/f7bb7cd0-7579-44b5-bac2-ae93c122858a-kube-api-access-kjj6z\") pod \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") "
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.113957 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-combined-ca-bundle\") pod \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") "
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.114010 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-config-data\") pod \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\" (UID: \"f7bb7cd0-7579-44b5-bac2-ae93c122858a\") "
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.117423 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7bb7cd0-7579-44b5-bac2-ae93c122858a-kube-api-access-kjj6z" (OuterVolumeSpecName: "kube-api-access-kjj6z") pod "f7bb7cd0-7579-44b5-bac2-ae93c122858a" (UID: "f7bb7cd0-7579-44b5-bac2-ae93c122858a"). InnerVolumeSpecName "kube-api-access-kjj6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.148405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7bb7cd0-7579-44b5-bac2-ae93c122858a" (UID: "f7bb7cd0-7579-44b5-bac2-ae93c122858a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.160769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-config-data" (OuterVolumeSpecName: "config-data") pod "f7bb7cd0-7579-44b5-bac2-ae93c122858a" (UID: "f7bb7cd0-7579-44b5-bac2-ae93c122858a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.215541 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjj6z\" (UniqueName: \"kubernetes.io/projected/f7bb7cd0-7579-44b5-bac2-ae93c122858a-kube-api-access-kjj6z\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.215583 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.215597 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7bb7cd0-7579-44b5-bac2-ae93c122858a-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.643816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-mndqc" event={"ID":"f7bb7cd0-7579-44b5-bac2-ae93c122858a","Type":"ContainerDied","Data":"f535a3baa560b103995262d53df6b66e29b9da6e57224714ee2499adc28f95b9"}
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.644201 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f535a3baa560b103995262d53df6b66e29b9da6e57224714ee2499adc28f95b9"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.643887 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-mndqc"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.918088 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d4bc6d655-p268f"]
Oct 01 13:24:56 crc kubenswrapper[4749]: E1001 13:24:56.920624 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2" containerName="mariadb-account-create"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.920666 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2" containerName="mariadb-account-create"
Oct 01 13:24:56 crc kubenswrapper[4749]: E1001 13:24:56.920684 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7bb7cd0-7579-44b5-bac2-ae93c122858a" containerName="keystone-db-sync"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.920690 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7bb7cd0-7579-44b5-bac2-ae93c122858a" containerName="keystone-db-sync"
Oct 01 13:24:56 crc kubenswrapper[4749]: E1001 13:24:56.920706 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9896d0f7-92a6-46b0-88ce-b64b390998c5" containerName="watcher-db-sync"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.920712 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9896d0f7-92a6-46b0-88ce-b64b390998c5" containerName="watcher-db-sync"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.920944 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2" containerName="mariadb-account-create"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.920967 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9896d0f7-92a6-46b0-88ce-b64b390998c5" containerName="watcher-db-sync"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.920975 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7bb7cd0-7579-44b5-bac2-ae93c122858a" containerName="keystone-db-sync"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.921865 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.927784 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d4bc6d655-p268f"]
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.949243 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fz544"]
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.953970 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.960151 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.960618 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.962714 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-776bb"
Oct 01 13:24:56 crc kubenswrapper[4749]: I1001 13:24:56.969409 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.000027 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fz544"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.027146 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-config\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.027195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-svc\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.027252 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-swift-storage-0\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.027312 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.027415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5rvx\" (UniqueName: \"kubernetes.io/projected/617c6727-b523-4976-bf75-19c87ff896ce-kube-api-access-n5rvx\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.027444 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.031302 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.032603 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.039609 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.039783 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-nk64q"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.057111 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.085669 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59bbdc7665-bcszc"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.086957 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59bbdc7665-bcszc"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.091208 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.091396 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.091397 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mhkl5"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.091473 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.116251 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59bbdc7665-bcszc"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-scripts\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128666 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-credential-keys\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128700 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128725 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128771 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-config-data\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128790 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ct5n\" (UniqueName: \"kubernetes.io/projected/dd95f834-908d-4a7c-8552-b93425ae5dd8-kube-api-access-8ct5n\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-combined-ca-bundle\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128842 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2qg6\" (UniqueName: \"kubernetes.io/projected/57aa7c22-d447-48f7-b16b-517c8553dc09-kube-api-access-j2qg6\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128864 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd95f834-908d-4a7c-8552-b93425ae5dd8-logs\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-config-data\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128900 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5rvx\" (UniqueName: \"kubernetes.io/projected/617c6727-b523-4976-bf75-19c87ff896ce-kube-api-access-n5rvx\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128914 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-fernet-keys\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-svc\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.128979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-config\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.129011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-swift-storage-0\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.129935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-swift-storage-0\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.130486 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-sb\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.131276 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-nb\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.131750 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-svc\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.132261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-config\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.132294 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.133309 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.139539 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.159616 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.168103 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.169797 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5rvx\" (UniqueName: \"kubernetes.io/projected/617c6727-b523-4976-bf75-19c87ff896ce-kube-api-access-n5rvx\") pod \"dnsmasq-dns-7d4bc6d655-p268f\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " pod="openstack/dnsmasq-dns-7d4bc6d655-p268f"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.181302 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.184888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.216744 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-867ff5dd5c-2lf7x"]
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.238896 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-867ff5dd5c-2lf7x"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-config-data\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-fernet-keys\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243118 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243206 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p55lr\" (UniqueName: \"kubernetes.io/projected/fab3f7e4-b73c-4e23-839f-719e5b8ca205-kube-api-access-p55lr\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-horizon-secret-key\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243338 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-scripts\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243391 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-credential-keys\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-config-data\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243499 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-config-data\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243637 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ct5n\" (UniqueName: \"kubernetes.io/projected/dd95f834-908d-4a7c-8552-b93425ae5dd8-kube-api-access-8ct5n\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-scripts\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243685 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab3f7e4-b73c-4e23-839f-719e5b8ca205-logs\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243721 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-combined-ca-bundle\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0"
Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243793 4749 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-j2qg6\" (UniqueName: \"kubernetes.io/projected/57aa7c22-d447-48f7-b16b-517c8553dc09-kube-api-access-j2qg6\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd95f834-908d-4a7c-8552-b93425ae5dd8-logs\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243853 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qqlt\" (UniqueName: \"kubernetes.io/projected/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-kube-api-access-6qqlt\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243877 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-config-data\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.243901 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-logs\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.263210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.271928 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd95f834-908d-4a7c-8552-b93425ae5dd8-logs\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.271987 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4bc6d655-p268f" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.272604 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-config-data\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.273522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-fernet-keys\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.273902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-combined-ca-bundle\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.275177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-credential-keys\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.275485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-scripts\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.278968 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-config-data\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.367000 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ct5n\" (UniqueName: \"kubernetes.io/projected/dd95f834-908d-4a7c-8552-b93425ae5dd8-kube-api-access-8ct5n\") pod \"watcher-applier-0\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " pod="openstack/watcher-applier-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab3f7e4-b73c-4e23-839f-719e5b8ca205-logs\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372310 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f86cj\" (UniqueName: \"kubernetes.io/projected/21d4a7bd-568e-4b4f-854a-ec3963262172-kube-api-access-f86cj\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: 
\"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372349 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372367 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-logs\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qqlt\" (UniqueName: \"kubernetes.io/projected/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-kube-api-access-6qqlt\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372440 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-config-data\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372456 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-logs\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc 
kubenswrapper[4749]: I1001 13:24:57.372477 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-config-data\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-config-data\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372554 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p55lr\" (UniqueName: \"kubernetes.io/projected/fab3f7e4-b73c-4e23-839f-719e5b8ca205-kube-api-access-p55lr\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372585 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-horizon-secret-key\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372623 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-scripts\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372642 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prs8b\" (UniqueName: \"kubernetes.io/projected/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-kube-api-access-prs8b\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372687 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-config-data\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372746 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372781 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21d4a7bd-568e-4b4f-854a-ec3963262172-horizon-secret-key\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21d4a7bd-568e-4b4f-854a-ec3963262172-logs\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.372844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-scripts\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.373556 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2qg6\" (UniqueName: \"kubernetes.io/projected/57aa7c22-d447-48f7-b16b-517c8553dc09-kube-api-access-j2qg6\") pod \"keystone-bootstrap-fz544\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " pod="openstack/keystone-bootstrap-fz544" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.375646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-config-data\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.376102 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-logs\") pod 
\"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.377401 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab3f7e4-b73c-4e23-839f-719e5b8ca205-logs\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.379429 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-scripts\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.380035 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.382068 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-config-data\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.382687 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.386565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-custom-prometheus-ca\") pod 
\"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.397447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-horizon-secret-key\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.400129 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-867ff5dd5c-2lf7x"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.409431 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qqlt\" (UniqueName: \"kubernetes.io/projected/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-kube-api-access-6qqlt\") pod \"horizon-59bbdc7665-bcszc\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.411627 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.411854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p55lr\" (UniqueName: \"kubernetes.io/projected/fab3f7e4-b73c-4e23-839f-719e5b8ca205-kube-api-access-p55lr\") pod \"watcher-decision-engine-0\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.432598 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.436396 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.444085 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.444142 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.444205 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.454553 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.465333 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-5j2w8"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.466499 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.468209 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.468395 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.468612 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hp8k2" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21d4a7bd-568e-4b4f-854a-ec3963262172-logs\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474401 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f86cj\" (UniqueName: \"kubernetes.io/projected/21d4a7bd-568e-4b4f-854a-ec3963262172-kube-api-access-f86cj\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-logs\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-config-data\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474487 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-config-data\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-scripts\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474553 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prs8b\" (UniqueName: 
\"kubernetes.io/projected/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-kube-api-access-prs8b\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474584 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474618 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.474645 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21d4a7bd-568e-4b4f-854a-ec3963262172-horizon-secret-key\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.475785 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-scripts\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.475933 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5j2w8"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.476472 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-logs\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.476498 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21d4a7bd-568e-4b4f-854a-ec3963262172-logs\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.486934 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.488199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-config-data\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.491280 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4bc6d655-p268f"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.494149 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prs8b\" (UniqueName: \"kubernetes.io/projected/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-kube-api-access-prs8b\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.496021 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f7f865789-9mjp6"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.497496 4749 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.497939 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-config-data\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.502739 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f7f865789-9mjp6"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.504007 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21d4a7bd-568e-4b4f-854a-ec3963262172-horizon-secret-key\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.505925 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.506880 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f86cj\" (UniqueName: \"kubernetes.io/projected/21d4a7bd-568e-4b4f-854a-ec3963262172-kube-api-access-f86cj\") pod \"horizon-867ff5dd5c-2lf7x\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.539174 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.570993 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fz544" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576148 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qvcs\" (UniqueName: \"kubernetes.io/projected/24e52665-f55b-4137-82ec-7ab7392bca61-kube-api-access-8qvcs\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576198 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-config-data\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576558 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ea02603-ed44-4faa-ae1e-37cf61162fde-logs\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576606 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-config-data\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576632 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-combined-ca-bundle\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576656 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8pmv\" (UniqueName: \"kubernetes.io/projected/9ea02603-ed44-4faa-ae1e-37cf61162fde-kube-api-access-k8pmv\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576673 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-scripts\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576730 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-scripts\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-log-httpd\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576797 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.576812 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-run-httpd\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.657160 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4278f\" (UniqueName: \"kubernetes.io/projected/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-kube-api-access-4278f\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-swift-storage-0\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678128 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-config-data\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678143 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-nb\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678167 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ea02603-ed44-4faa-ae1e-37cf61162fde-logs\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " 
pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678206 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-config-data\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-combined-ca-bundle\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678294 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-sb\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8pmv\" (UniqueName: \"kubernetes.io/projected/9ea02603-ed44-4faa-ae1e-37cf61162fde-kube-api-access-k8pmv\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678331 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 
13:24:57.678349 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-scripts\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678387 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-scripts\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-log-httpd\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-config\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678460 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-svc\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678478 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-run-httpd\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.678528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qvcs\" (UniqueName: \"kubernetes.io/projected/24e52665-f55b-4137-82ec-7ab7392bca61-kube-api-access-8qvcs\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.684576 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-run-httpd\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.685058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ea02603-ed44-4faa-ae1e-37cf61162fde-logs\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.685949 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 
13:24:57.688406 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-config-data\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.690141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-config-data\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.690360 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-scripts\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.691646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.692058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-scripts\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.693737 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-combined-ca-bundle\") pod \"placement-db-sync-5j2w8\" (UID: 
\"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.693963 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-log-httpd\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.709772 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8pmv\" (UniqueName: \"kubernetes.io/projected/9ea02603-ed44-4faa-ae1e-37cf61162fde-kube-api-access-k8pmv\") pod \"placement-db-sync-5j2w8\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") " pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.710022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qvcs\" (UniqueName: \"kubernetes.io/projected/24e52665-f55b-4137-82ec-7ab7392bca61-kube-api-access-8qvcs\") pod \"ceilometer-0\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.714559 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-h459j"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.715597 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.719809 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pxffj" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.719972 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.720116 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.738149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-h459j"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.755684 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.782713 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4278f\" (UniqueName: \"kubernetes.io/projected/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-kube-api-access-4278f\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.782759 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-swift-storage-0\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.782781 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-nb\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: 
\"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.782838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-sb\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.782901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-config\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.782929 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-svc\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.783765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-svc\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.784564 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-swift-storage-0\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc 
kubenswrapper[4749]: I1001 13:24:57.785997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-sb\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.786087 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-nb\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.787053 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-config\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.801655 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5j2w8" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.802244 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.806427 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4278f\" (UniqueName: \"kubernetes.io/projected/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-kube-api-access-4278f\") pod \"dnsmasq-dns-f7f865789-9mjp6\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") " pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.817627 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.827073 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4bc6d655-p268f"] Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.905570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-combined-ca-bundle\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.905620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzmd7\" (UniqueName: \"kubernetes.io/projected/4f535ba4-1d6d-4103-8764-c324341bffdd-kube-api-access-rzmd7\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:57 crc kubenswrapper[4749]: I1001 13:24:57.905662 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-config\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.014488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-combined-ca-bundle\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.014750 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzmd7\" (UniqueName: 
\"kubernetes.io/projected/4f535ba4-1d6d-4103-8764-c324341bffdd-kube-api-access-rzmd7\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.014785 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-config\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.031869 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59bbdc7665-bcszc"] Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.035410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-config\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.046829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-combined-ca-bundle\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.058362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzmd7\" (UniqueName: \"kubernetes.io/projected/4f535ba4-1d6d-4103-8764-c324341bffdd-kube-api-access-rzmd7\") pod \"neutron-db-sync-h459j\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.058721 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-h459j" Oct 01 13:24:58 crc kubenswrapper[4749]: W1001 13:24:58.106134 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bf8736f_97c7_4f81_af11_2cab4dfff4b0.slice/crio-dbef047dd9d8febf3f7739bb6bd2b84a2420a15a93a3b7db2b51801b704a12b6 WatchSource:0}: Error finding container dbef047dd9d8febf3f7739bb6bd2b84a2420a15a93a3b7db2b51801b704a12b6: Status 404 returned error can't find the container with id dbef047dd9d8febf3f7739bb6bd2b84a2420a15a93a3b7db2b51801b704a12b6 Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.142260 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.161074 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:24:58 crc kubenswrapper[4749]: W1001 13:24:58.295871 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfab3f7e4_b73c_4e23_839f_719e5b8ca205.slice/crio-19e258b3a3042561a761627661d9140ad065447df9a175e2385e71b6291795c8 WatchSource:0}: Error finding container 19e258b3a3042561a761627661d9140ad065447df9a175e2385e71b6291795c8: Status 404 returned error can't find the container with id 19e258b3a3042561a761627661d9140ad065447df9a175e2385e71b6291795c8 Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.353889 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fz544"] Oct 01 13:24:58 crc kubenswrapper[4749]: W1001 13:24:58.424575 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57aa7c22_d447_48f7_b16b_517c8553dc09.slice/crio-ca628e459558d4b845c8bea941e85fc7f8c316fda189878bbc8bb048f09c2746 WatchSource:0}: Error finding container 
ca628e459558d4b845c8bea941e85fc7f8c316fda189878bbc8bb048f09c2746: Status 404 returned error can't find the container with id ca628e459558d4b845c8bea941e85fc7f8c316fda189878bbc8bb048f09c2746 Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.555227 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 01 13:24:58 crc kubenswrapper[4749]: W1001 13:24:58.584842 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd95f834_908d_4a7c_8552_b93425ae5dd8.slice/crio-b08840ab2a55da2ea2039f3b9f1460d477c0bfc283b0afc1e67d527c46c1dce1 WatchSource:0}: Error finding container b08840ab2a55da2ea2039f3b9f1460d477c0bfc283b0afc1e67d527c46c1dce1: Status 404 returned error can't find the container with id b08840ab2a55da2ea2039f3b9f1460d477c0bfc283b0afc1e67d527c46c1dce1 Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.665382 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-867ff5dd5c-2lf7x"] Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.677339 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fz544" event={"ID":"57aa7c22-d447-48f7-b16b-517c8553dc09","Type":"ContainerStarted","Data":"ca628e459558d4b845c8bea941e85fc7f8c316fda189878bbc8bb048f09c2746"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.678657 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fab3f7e4-b73c-4e23-839f-719e5b8ca205","Type":"ContainerStarted","Data":"19e258b3a3042561a761627661d9140ad065447df9a175e2385e71b6291795c8"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.687562 4749 generic.go:334] "Generic (PLEG): container finished" podID="617c6727-b523-4976-bf75-19c87ff896ce" containerID="6162d6b9837ad548fad1358eab9505e53d450d8961255e3aa2ca4a3b05ccacf5" exitCode=0 Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.687637 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4bc6d655-p268f" event={"ID":"617c6727-b523-4976-bf75-19c87ff896ce","Type":"ContainerDied","Data":"6162d6b9837ad548fad1358eab9505e53d450d8961255e3aa2ca4a3b05ccacf5"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.687665 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4bc6d655-p268f" event={"ID":"617c6727-b523-4976-bf75-19c87ff896ce","Type":"ContainerStarted","Data":"8d5b44103355259be2d78d49fc58471d99956e6087e1c2d31c0ca323ccd09a78"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.693977 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59bbdc7665-bcszc" event={"ID":"0bf8736f-97c7-4f81-af11-2cab4dfff4b0","Type":"ContainerStarted","Data":"dbef047dd9d8febf3f7739bb6bd2b84a2420a15a93a3b7db2b51801b704a12b6"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.697824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"dd95f834-908d-4a7c-8552-b93425ae5dd8","Type":"ContainerStarted","Data":"b08840ab2a55da2ea2039f3b9f1460d477c0bfc283b0afc1e67d527c46c1dce1"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.700583 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b","Type":"ContainerStarted","Data":"0e9422034261039d929208334748b5eeee073307340b5829b59c177ef52b029c"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.700605 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b","Type":"ContainerStarted","Data":"a73a05589d4e2e5987792d3603abf69610dbdd5f9ae77851b2fd4fa4670a71f0"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.702112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867ff5dd5c-2lf7x" 
event={"ID":"21d4a7bd-568e-4b4f-854a-ec3963262172","Type":"ContainerStarted","Data":"19a9229bbd35a9fb02509a56615bc8f5abb556125be38e09eeab5b9f105f08f6"} Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.755450 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:24:58 crc kubenswrapper[4749]: W1001 13:24:58.761615 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24e52665_f55b_4137_82ec_7ab7392bca61.slice/crio-13d2c646ddbe1198028ba22d41a228313966c057b9ba0a84fda242c2d4fd0deb WatchSource:0}: Error finding container 13d2c646ddbe1198028ba22d41a228313966c057b9ba0a84fda242c2d4fd0deb: Status 404 returned error can't find the container with id 13d2c646ddbe1198028ba22d41a228313966c057b9ba0a84fda242c2d4fd0deb Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.775710 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f7f865789-9mjp6"] Oct 01 13:24:58 crc kubenswrapper[4749]: W1001 13:24:58.777727 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc79b5ea5_4a70_4868_b72a_bc8efb0cb967.slice/crio-83fcbcc736369e1a4f2afdc9eec350e8f2c7c2b324865725270b38bcb547c6f6 WatchSource:0}: Error finding container 83fcbcc736369e1a4f2afdc9eec350e8f2c7c2b324865725270b38bcb547c6f6: Status 404 returned error can't find the container with id 83fcbcc736369e1a4f2afdc9eec350e8f2c7c2b324865725270b38bcb547c6f6 Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.786005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5j2w8"] Oct 01 13:24:58 crc kubenswrapper[4749]: I1001 13:24:58.947254 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-h459j"] Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.280570 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d4bc6d655-p268f" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.360980 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-config\") pod \"617c6727-b523-4976-bf75-19c87ff896ce\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.361156 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5rvx\" (UniqueName: \"kubernetes.io/projected/617c6727-b523-4976-bf75-19c87ff896ce-kube-api-access-n5rvx\") pod \"617c6727-b523-4976-bf75-19c87ff896ce\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.361265 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-svc\") pod \"617c6727-b523-4976-bf75-19c87ff896ce\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.361282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-sb\") pod \"617c6727-b523-4976-bf75-19c87ff896ce\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.361319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-nb\") pod \"617c6727-b523-4976-bf75-19c87ff896ce\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.361340 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-swift-storage-0\") pod \"617c6727-b523-4976-bf75-19c87ff896ce\" (UID: \"617c6727-b523-4976-bf75-19c87ff896ce\") " Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.370626 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617c6727-b523-4976-bf75-19c87ff896ce-kube-api-access-n5rvx" (OuterVolumeSpecName: "kube-api-access-n5rvx") pod "617c6727-b523-4976-bf75-19c87ff896ce" (UID: "617c6727-b523-4976-bf75-19c87ff896ce"). InnerVolumeSpecName "kube-api-access-n5rvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.387135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "617c6727-b523-4976-bf75-19c87ff896ce" (UID: "617c6727-b523-4976-bf75-19c87ff896ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.388953 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "617c6727-b523-4976-bf75-19c87ff896ce" (UID: "617c6727-b523-4976-bf75-19c87ff896ce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.393304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "617c6727-b523-4976-bf75-19c87ff896ce" (UID: "617c6727-b523-4976-bf75-19c87ff896ce"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.397054 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "617c6727-b523-4976-bf75-19c87ff896ce" (UID: "617c6727-b523-4976-bf75-19c87ff896ce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.427107 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-config" (OuterVolumeSpecName: "config") pod "617c6727-b523-4976-bf75-19c87ff896ce" (UID: "617c6727-b523-4976-bf75-19c87ff896ce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.464757 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.464791 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.464799 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.464808 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" 
Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.464816 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/617c6727-b523-4976-bf75-19c87ff896ce-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.464824 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5rvx\" (UniqueName: \"kubernetes.io/projected/617c6727-b523-4976-bf75-19c87ff896ce-kube-api-access-n5rvx\") on node \"crc\" DevicePath \"\"" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.647912 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.714471 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.723418 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d4bc6d655-p268f" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.723504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d4bc6d655-p268f" event={"ID":"617c6727-b523-4976-bf75-19c87ff896ce","Type":"ContainerDied","Data":"8d5b44103355259be2d78d49fc58471d99956e6087e1c2d31c0ca323ccd09a78"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.725139 4749 scope.go:117] "RemoveContainer" containerID="6162d6b9837ad548fad1358eab9505e53d450d8961255e3aa2ca4a3b05ccacf5" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.730695 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e52665-f55b-4137-82ec-7ab7392bca61","Type":"ContainerStarted","Data":"13d2c646ddbe1198028ba22d41a228313966c057b9ba0a84fda242c2d4fd0deb"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.753129 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59bbdc7665-bcszc"] Oct 01 13:24:59 crc 
kubenswrapper[4749]: I1001 13:24:59.755999 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b","Type":"ContainerStarted","Data":"98997b25feac3e5bc45d4b2eec6d4f29a9a5d254396383fb8679da4d25a6a337"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.756637 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.765234 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fz544" event={"ID":"57aa7c22-d447-48f7-b16b-517c8553dc09","Type":"ContainerStarted","Data":"7e8f952e99a5fa6d951aa22fd3b1ae6372b3b49ec2747a5b680d1990793a70bb"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.798262 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d96cdbf7c-fpnt7"] Oct 01 13:24:59 crc kubenswrapper[4749]: E1001 13:24:59.798610 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617c6727-b523-4976-bf75-19c87ff896ce" containerName="init" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.798628 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="617c6727-b523-4976-bf75-19c87ff896ce" containerName="init" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.798826 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="617c6727-b523-4976-bf75-19c87ff896ce" containerName="init" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.800995 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.805076 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d96cdbf7c-fpnt7"] Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.810747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h459j" event={"ID":"4f535ba4-1d6d-4103-8764-c324341bffdd","Type":"ContainerStarted","Data":"1e2f818f18cc349d92509baf7334070dc169aafeec21bb02c2074589bdda24e3"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.810788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h459j" event={"ID":"4f535ba4-1d6d-4103-8764-c324341bffdd","Type":"ContainerStarted","Data":"f64a59da9c721ad80826aa0a0c10493bbc90b3d1dbb66679ea531bc5cbcb82a4"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.822720 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.822698477 podStartE2EDuration="2.822698477s" podCreationTimestamp="2025-10-01 13:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:24:59.786780544 +0000 UTC m=+1159.840765443" watchObservedRunningTime="2025-10-01 13:24:59.822698477 +0000 UTC m=+1159.876683376" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.840581 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d4bc6d655-p268f"] Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.857553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d55n6" event={"ID":"7bcae573-48fa-4920-8b8f-4df57d4c5375","Type":"ContainerStarted","Data":"7ad4fe9694512426bf33f9e9fffb584b0eb79d993f651695d89fa084a7ae5d2c"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.857903 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7d4bc6d655-p268f"] Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.891611 4749 generic.go:334] "Generic (PLEG): container finished" podID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerID="ad615b39edf900a955c90d5cbdd1a5684726124559af5f04e44d0233af379b78" exitCode=0 Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.891717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" event={"ID":"c79b5ea5-4a70-4868-b72a-bc8efb0cb967","Type":"ContainerDied","Data":"ad615b39edf900a955c90d5cbdd1a5684726124559af5f04e44d0233af379b78"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.891751 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" event={"ID":"c79b5ea5-4a70-4868-b72a-bc8efb0cb967","Type":"ContainerStarted","Data":"83fcbcc736369e1a4f2afdc9eec350e8f2c7c2b324865725270b38bcb547c6f6"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.899850 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5j2w8" event={"ID":"9ea02603-ed44-4faa-ae1e-37cf61162fde","Type":"ContainerStarted","Data":"71421e79c0324bf67f0b678a0ad34a7d020a713a53123b01750ca10a9ae5aa2d"} Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.914204 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-h459j" podStartSLOduration=2.914182478 podStartE2EDuration="2.914182478s" podCreationTimestamp="2025-10-01 13:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:24:59.889393475 +0000 UTC m=+1159.943378364" watchObservedRunningTime="2025-10-01 13:24:59.914182478 +0000 UTC m=+1159.968167377" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.964605 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fz544" 
podStartSLOduration=3.964582627 podStartE2EDuration="3.964582627s" podCreationTimestamp="2025-10-01 13:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:24:59.908605998 +0000 UTC m=+1159.962590897" watchObservedRunningTime="2025-10-01 13:24:59.964582627 +0000 UTC m=+1160.018567536" Oct 01 13:24:59 crc kubenswrapper[4749]: I1001 13:24:59.986869 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-d55n6" podStartSLOduration=4.241809859 podStartE2EDuration="36.986852268s" podCreationTimestamp="2025-10-01 13:24:23 +0000 UTC" firstStartedPulling="2025-10-01 13:24:24.651984612 +0000 UTC m=+1124.705969511" lastFinishedPulling="2025-10-01 13:24:57.397027021 +0000 UTC m=+1157.451011920" observedRunningTime="2025-10-01 13:24:59.948184256 +0000 UTC m=+1160.002169165" watchObservedRunningTime="2025-10-01 13:24:59.986852268 +0000 UTC m=+1160.040837167" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.012401 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00959eed-1bce-4b1a-9978-e978b9bdb4cf-logs\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.012488 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00959eed-1bce-4b1a-9978-e978b9bdb4cf-horizon-secret-key\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.012509 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w674x\" (UniqueName: 
\"kubernetes.io/projected/00959eed-1bce-4b1a-9978-e978b9bdb4cf-kube-api-access-w674x\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.012677 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-config-data\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.012731 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-scripts\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.113790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-config-data\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.113900 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-scripts\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.113952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00959eed-1bce-4b1a-9978-e978b9bdb4cf-logs\") pod 
\"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.113976 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00959eed-1bce-4b1a-9978-e978b9bdb4cf-horizon-secret-key\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.113992 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w674x\" (UniqueName: \"kubernetes.io/projected/00959eed-1bce-4b1a-9978-e978b9bdb4cf-kube-api-access-w674x\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.115537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-config-data\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.115940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-scripts\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.116135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00959eed-1bce-4b1a-9978-e978b9bdb4cf-logs\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc 
kubenswrapper[4749]: I1001 13:25:00.119186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00959eed-1bce-4b1a-9978-e978b9bdb4cf-horizon-secret-key\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.130869 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w674x\" (UniqueName: \"kubernetes.io/projected/00959eed-1bce-4b1a-9978-e978b9bdb4cf-kube-api-access-w674x\") pod \"horizon-7d96cdbf7c-fpnt7\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.237542 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.915917 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api-log" containerID="cri-o://0e9422034261039d929208334748b5eeee073307340b5829b59c177ef52b029c" gracePeriod=30 Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.916271 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" containerID="cri-o://98997b25feac3e5bc45d4b2eec6d4f29a9a5d254396383fb8679da4d25a6a337" gracePeriod=30 Oct 01 13:25:00 crc kubenswrapper[4749]: I1001 13:25:00.923635 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": EOF" Oct 01 13:25:01 crc kubenswrapper[4749]: I1001 13:25:01.268877 4749 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="617c6727-b523-4976-bf75-19c87ff896ce" path="/var/lib/kubelet/pods/617c6727-b523-4976-bf75-19c87ff896ce/volumes" Oct 01 13:25:01 crc kubenswrapper[4749]: I1001 13:25:01.931231 4749 generic.go:334] "Generic (PLEG): container finished" podID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerID="0e9422034261039d929208334748b5eeee073307340b5829b59c177ef52b029c" exitCode=143 Oct 01 13:25:01 crc kubenswrapper[4749]: I1001 13:25:01.931303 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b","Type":"ContainerDied","Data":"0e9422034261039d929208334748b5eeee073307340b5829b59c177ef52b029c"} Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.270673 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0725-account-create-nxf2m"] Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.271845 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0725-account-create-nxf2m" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.274315 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.298982 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0725-account-create-nxf2m"] Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.383616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2k6p\" (UniqueName: \"kubernetes.io/projected/48377824-164a-42bd-9374-f1fadf80ddf6-kube-api-access-l2k6p\") pod \"barbican-0725-account-create-nxf2m\" (UID: \"48377824-164a-42bd-9374-f1fadf80ddf6\") " pod="openstack/barbican-0725-account-create-nxf2m" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.386047 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4d06-account-create-rwqkp"] Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.387655 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4d06-account-create-rwqkp" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.390339 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.433421 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4d06-account-create-rwqkp"] Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.485077 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2k6p\" (UniqueName: \"kubernetes.io/projected/48377824-164a-42bd-9374-f1fadf80ddf6-kube-api-access-l2k6p\") pod \"barbican-0725-account-create-nxf2m\" (UID: \"48377824-164a-42bd-9374-f1fadf80ddf6\") " pod="openstack/barbican-0725-account-create-nxf2m" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.516463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2k6p\" (UniqueName: \"kubernetes.io/projected/48377824-164a-42bd-9374-f1fadf80ddf6-kube-api-access-l2k6p\") pod \"barbican-0725-account-create-nxf2m\" (UID: \"48377824-164a-42bd-9374-f1fadf80ddf6\") " pod="openstack/barbican-0725-account-create-nxf2m" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.541377 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.586778 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5l4\" (UniqueName: \"kubernetes.io/projected/75ae7232-a3d3-43dc-b003-f6e47b5e6868-kube-api-access-pk5l4\") pod \"cinder-4d06-account-create-rwqkp\" (UID: \"75ae7232-a3d3-43dc-b003-f6e47b5e6868\") " pod="openstack/cinder-4d06-account-create-rwqkp" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.600350 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0725-account-create-nxf2m" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.688643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5l4\" (UniqueName: \"kubernetes.io/projected/75ae7232-a3d3-43dc-b003-f6e47b5e6868-kube-api-access-pk5l4\") pod \"cinder-4d06-account-create-rwqkp\" (UID: \"75ae7232-a3d3-43dc-b003-f6e47b5e6868\") " pod="openstack/cinder-4d06-account-create-rwqkp" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.706667 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5l4\" (UniqueName: \"kubernetes.io/projected/75ae7232-a3d3-43dc-b003-f6e47b5e6868-kube-api-access-pk5l4\") pod \"cinder-4d06-account-create-rwqkp\" (UID: \"75ae7232-a3d3-43dc-b003-f6e47b5e6868\") " pod="openstack/cinder-4d06-account-create-rwqkp" Oct 01 13:25:02 crc kubenswrapper[4749]: I1001 13:25:02.708277 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4d06-account-create-rwqkp" Oct 01 13:25:03 crc kubenswrapper[4749]: I1001 13:25:03.953478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" event={"ID":"c79b5ea5-4a70-4868-b72a-bc8efb0cb967","Type":"ContainerStarted","Data":"d2bfb15f46d0e5919d2d8e5d14975a28845e64239f7e1258b651e08a349a7316"} Oct 01 13:25:03 crc kubenswrapper[4749]: I1001 13:25:03.953946 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:25:03 crc kubenswrapper[4749]: I1001 13:25:03.987619 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" podStartSLOduration=6.9875996879999995 podStartE2EDuration="6.987599688s" podCreationTimestamp="2025-10-01 13:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:25:03.980636982 +0000 UTC m=+1164.034621881" watchObservedRunningTime="2025-10-01 13:25:03.987599688 +0000 UTC m=+1164.041584577" Oct 01 13:25:04 crc kubenswrapper[4749]: I1001 13:25:04.034064 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 01 13:25:04 crc kubenswrapper[4749]: I1001 13:25:04.670141 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": read tcp 10.217.0.2:46120->10.217.0.150:9322: read: connection reset by peer" Oct 01 13:25:04 crc kubenswrapper[4749]: I1001 13:25:04.965557 4749 generic.go:334] "Generic (PLEG): container finished" podID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerID="98997b25feac3e5bc45d4b2eec6d4f29a9a5d254396383fb8679da4d25a6a337" exitCode=0 Oct 01 13:25:04 crc kubenswrapper[4749]: I1001 13:25:04.965668 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b","Type":"ContainerDied","Data":"98997b25feac3e5bc45d4b2eec6d4f29a9a5d254396383fb8679da4d25a6a337"} Oct 01 13:25:05 crc kubenswrapper[4749]: I1001 13:25:05.979015 4749 generic.go:334] "Generic (PLEG): container finished" podID="57aa7c22-d447-48f7-b16b-517c8553dc09" containerID="7e8f952e99a5fa6d951aa22fd3b1ae6372b3b49ec2747a5b680d1990793a70bb" exitCode=0 Oct 01 13:25:05 crc kubenswrapper[4749]: I1001 13:25:05.979056 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fz544" event={"ID":"57aa7c22-d447-48f7-b16b-517c8553dc09","Type":"ContainerDied","Data":"7e8f952e99a5fa6d951aa22fd3b1ae6372b3b49ec2747a5b680d1990793a70bb"} Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.329819 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-867ff5dd5c-2lf7x"] Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.357499 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84f6c74b66-trbb9"] Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.359101 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.363480 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.373739 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84f6c74b66-trbb9"] Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.437403 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d96cdbf7c-fpnt7"] Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.463910 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b74b5b846-r84t7"] Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.468562 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.471363 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-combined-ca-bundle\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.471412 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-tls-certs\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.471501 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjtp\" (UniqueName: \"kubernetes.io/projected/7ae81866-7b3d-44f4-a4fd-5b49e2223352-kube-api-access-bjjtp\") pod 
\"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.471519 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-scripts\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.471537 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae81866-7b3d-44f4-a4fd-5b49e2223352-logs\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.471570 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-config-data\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.471594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-secret-key\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.491337 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b74b5b846-r84t7"] Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573494 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-combined-ca-bundle\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22321e2-ded2-4732-ac89-f9f0d4dcd199-logs\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573584 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e22321e2-ded2-4732-ac89-f9f0d4dcd199-config-data\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573615 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjtp\" (UniqueName: \"kubernetes.io/projected/7ae81866-7b3d-44f4-a4fd-5b49e2223352-kube-api-access-bjjtp\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573637 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-scripts\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573659 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7ae81866-7b3d-44f4-a4fd-5b49e2223352-logs\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-horizon-secret-key\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-horizon-tls-certs\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573735 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxbh7\" (UniqueName: \"kubernetes.io/projected/e22321e2-ded2-4732-ac89-f9f0d4dcd199-kube-api-access-cxbh7\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573842 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-config-data\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.573985 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7ae81866-7b3d-44f4-a4fd-5b49e2223352-logs\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.574320 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-secret-key\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.574370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-combined-ca-bundle\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.574421 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e22321e2-ded2-4732-ac89-f9f0d4dcd199-scripts\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.574451 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-tls-certs\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.574861 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-scripts\") pod 
\"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.575485 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-config-data\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.594666 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-secret-key\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.594795 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-combined-ca-bundle\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.596805 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-tls-certs\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.606286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjtp\" (UniqueName: \"kubernetes.io/projected/7ae81866-7b3d-44f4-a4fd-5b49e2223352-kube-api-access-bjjtp\") pod \"horizon-84f6c74b66-trbb9\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") " 
pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.676495 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e22321e2-ded2-4732-ac89-f9f0d4dcd199-scripts\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.676622 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-combined-ca-bundle\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.676653 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22321e2-ded2-4732-ac89-f9f0d4dcd199-logs\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.676693 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e22321e2-ded2-4732-ac89-f9f0d4dcd199-config-data\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.676722 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-horizon-secret-key\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.676752 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-horizon-tls-certs\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.676776 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxbh7\" (UniqueName: \"kubernetes.io/projected/e22321e2-ded2-4732-ac89-f9f0d4dcd199-kube-api-access-cxbh7\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.677608 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e22321e2-ded2-4732-ac89-f9f0d4dcd199-scripts\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.680281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e22321e2-ded2-4732-ac89-f9f0d4dcd199-config-data\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.680537 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e22321e2-ded2-4732-ac89-f9f0d4dcd199-logs\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.687728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-combined-ca-bundle\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.695647 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-horizon-secret-key\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.696008 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22321e2-ded2-4732-ac89-f9f0d4dcd199-horizon-tls-certs\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.696653 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.709902 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxbh7\" (UniqueName: \"kubernetes.io/projected/e22321e2-ded2-4732-ac89-f9f0d4dcd199-kube-api-access-cxbh7\") pod \"horizon-b74b5b846-r84t7\" (UID: \"e22321e2-ded2-4732-ac89-f9f0d4dcd199\") " pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:06 crc kubenswrapper[4749]: I1001 13:25:06.794514 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:07 crc kubenswrapper[4749]: I1001 13:25:07.540709 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": dial tcp 10.217.0.150:9322: connect: connection refused" Oct 01 13:25:12 crc kubenswrapper[4749]: I1001 13:25:12.820548 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:25:12 crc kubenswrapper[4749]: I1001 13:25:12.910841 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f4bccb99-dh9mp"] Oct 01 13:25:12 crc kubenswrapper[4749]: I1001 13:25:12.911127 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" podUID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerName="dnsmasq-dns" containerID="cri-o://f7713d48470e3e620d6bdb4a22f20f9c3c9dfc31a1997b43431b79f3cbb5dac5" gracePeriod=10 Oct 01 13:25:15 crc kubenswrapper[4749]: I1001 13:25:15.082292 4749 generic.go:334] "Generic (PLEG): container finished" podID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerID="f7713d48470e3e620d6bdb4a22f20f9c3c9dfc31a1997b43431b79f3cbb5dac5" exitCode=0 Oct 01 13:25:15 crc kubenswrapper[4749]: I1001 13:25:15.082398 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" event={"ID":"9138f50d-1c41-4a52-84f6-2e3fa1e7f245","Type":"ContainerDied","Data":"f7713d48470e3e620d6bdb4a22f20f9c3c9dfc31a1997b43431b79f3cbb5dac5"} Oct 01 13:25:15 crc kubenswrapper[4749]: I1001 13:25:15.892365 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" podUID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: connect: connection 
refused" Oct 01 13:25:18 crc kubenswrapper[4749]: I1001 13:25:17.540385 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:25:18 crc kubenswrapper[4749]: I1001 13:25:17.540991 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:25:18 crc kubenswrapper[4749]: E1001 13:25:18.304134 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:25:18 crc kubenswrapper[4749]: E1001 13:25:18.304244 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:25:18 crc kubenswrapper[4749]: E1001 13:25:18.304423 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.30:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n65bh689hb9h5b6h64dh89h65dh544h58h56ch674h67hbch5bbh555h5d9h68bhd6h5cfh5b8h55ch5c7h574h5f5h64ch65hf9h5cbh9chd5h5d6h5f8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6qqlt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-59bbdc7665-bcszc_openstack(0bf8736f-97c7-4f81-af11-2cab4dfff4b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:25:18 crc kubenswrapper[4749]: E1001 
13:25:18.308851 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-59bbdc7665-bcszc" podUID="0bf8736f-97c7-4f81-af11-2cab4dfff4b0" Oct 01 13:25:18 crc kubenswrapper[4749]: I1001 13:25:18.913655 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fz544" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.048339 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-combined-ca-bundle\") pod \"57aa7c22-d447-48f7-b16b-517c8553dc09\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.048427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-fernet-keys\") pod \"57aa7c22-d447-48f7-b16b-517c8553dc09\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.048472 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2qg6\" (UniqueName: \"kubernetes.io/projected/57aa7c22-d447-48f7-b16b-517c8553dc09-kube-api-access-j2qg6\") pod \"57aa7c22-d447-48f7-b16b-517c8553dc09\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.048506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-scripts\") pod 
\"57aa7c22-d447-48f7-b16b-517c8553dc09\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.048581 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-credential-keys\") pod \"57aa7c22-d447-48f7-b16b-517c8553dc09\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.048792 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-config-data\") pod \"57aa7c22-d447-48f7-b16b-517c8553dc09\" (UID: \"57aa7c22-d447-48f7-b16b-517c8553dc09\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.053814 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "57aa7c22-d447-48f7-b16b-517c8553dc09" (UID: "57aa7c22-d447-48f7-b16b-517c8553dc09"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.054095 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-scripts" (OuterVolumeSpecName: "scripts") pod "57aa7c22-d447-48f7-b16b-517c8553dc09" (UID: "57aa7c22-d447-48f7-b16b-517c8553dc09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.057071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "57aa7c22-d447-48f7-b16b-517c8553dc09" (UID: "57aa7c22-d447-48f7-b16b-517c8553dc09"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.057140 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57aa7c22-d447-48f7-b16b-517c8553dc09-kube-api-access-j2qg6" (OuterVolumeSpecName: "kube-api-access-j2qg6") pod "57aa7c22-d447-48f7-b16b-517c8553dc09" (UID: "57aa7c22-d447-48f7-b16b-517c8553dc09"). InnerVolumeSpecName "kube-api-access-j2qg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.084110 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57aa7c22-d447-48f7-b16b-517c8553dc09" (UID: "57aa7c22-d447-48f7-b16b-517c8553dc09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.141557 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fz544" event={"ID":"57aa7c22-d447-48f7-b16b-517c8553dc09","Type":"ContainerDied","Data":"ca628e459558d4b845c8bea941e85fc7f8c316fda189878bbc8bb048f09c2746"} Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.141595 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca628e459558d4b845c8bea941e85fc7f8c316fda189878bbc8bb048f09c2746" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.141624 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fz544" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.144452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-config-data" (OuterVolumeSpecName: "config-data") pod "57aa7c22-d447-48f7-b16b-517c8553dc09" (UID: "57aa7c22-d447-48f7-b16b-517c8553dc09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.145971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b","Type":"ContainerDied","Data":"a73a05589d4e2e5987792d3603abf69610dbdd5f9ae77851b2fd4fa4670a71f0"} Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.146010 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a73a05589d4e2e5987792d3603abf69610dbdd5f9ae77851b2fd4fa4670a71f0" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.151239 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.151287 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.151317 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2qg6\" (UniqueName: \"kubernetes.io/projected/57aa7c22-d447-48f7-b16b-517c8553dc09-kube-api-access-j2qg6\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.151335 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.151352 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.151368 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57aa7c22-d447-48f7-b16b-517c8553dc09-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: E1001 13:25:19.151414 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Oct 01 13:25:19 crc kubenswrapper[4749]: E1001 13:25:19.151455 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Oct 01 13:25:19 crc kubenswrapper[4749]: E1001 13:25:19.151820 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bdhfbh58hcfh55ch587h56ch5ddhc8h54h585h5c8h598h679h5c7h575h5c5h54dh66h7ch5f6h575hf7h659h5f9h57bhdh87hbfhc6h77h5c8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qvcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(24e52665-f55b-4137-82ec-7ab7392bca61): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:25:19 crc kubenswrapper[4749]: E1001 13:25:19.435307 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:25:19 crc kubenswrapper[4749]: E1001 13:25:19.435835 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-horizon:watcher_latest" Oct 01 13:25:19 crc kubenswrapper[4749]: E1001 13:25:19.435968 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.102.83.30:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c7h57hf4h66bh68bh59fhddhc4h67fh546h55dh67ch689h99h5d7hb4h57ch564h54dh55ch8h57ch598h5f8hb6h6dh57dh594h5f5hbch56bhc9q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f86cj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-867ff5dd5c-2lf7x_openstack(21d4a7bd-568e-4b4f-854a-ec3963262172): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:25:19 crc kubenswrapper[4749]: E1001 
13:25:19.438495 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-867ff5dd5c-2lf7x" podUID="21d4a7bd-568e-4b4f-854a-ec3963262172" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.460267 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.564966 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-combined-ca-bundle\") pod \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.565059 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-logs\") pod \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.565097 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-custom-prometheus-ca\") pod \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.565152 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-config-data\") pod 
\"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.565186 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prs8b\" (UniqueName: \"kubernetes.io/projected/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-kube-api-access-prs8b\") pod \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\" (UID: \"8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b\") " Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.565787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-logs" (OuterVolumeSpecName: "logs") pod "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" (UID: "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.566293 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.572253 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-kube-api-access-prs8b" (OuterVolumeSpecName: "kube-api-access-prs8b") pod "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" (UID: "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b"). InnerVolumeSpecName "kube-api-access-prs8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.592755 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" (UID: "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.593388 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" (UID: "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.631781 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-config-data" (OuterVolumeSpecName: "config-data") pod "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" (UID: "8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.673416 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.673454 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prs8b\" (UniqueName: \"kubernetes.io/projected/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-kube-api-access-prs8b\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.673467 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.673478 4749 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b-custom-prometheus-ca\") on node 
\"crc\" DevicePath \"\"" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.960919 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:25:19 crc kubenswrapper[4749]: I1001 13:25:19.969806 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.004414 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b74b5b846-r84t7"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.032356 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d96cdbf7c-fpnt7"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.039449 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0725-account-create-nxf2m"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.083899 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4d06-account-create-rwqkp"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.090309 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84f6c74b66-trbb9"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.090767 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-nb\") pod \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.090827 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-horizon-secret-key\") pod \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.090857 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-config\") pod \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.090937 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-swift-storage-0\") pod \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.090979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-sb\") pod \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.091001 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-logs\") pod \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.091079 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-scripts\") pod \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.091099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z64tx\" (UniqueName: \"kubernetes.io/projected/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-kube-api-access-z64tx\") pod \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\" (UID: 
\"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.091126 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-svc\") pod \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\" (UID: \"9138f50d-1c41-4a52-84f6-2e3fa1e7f245\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.091149 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qqlt\" (UniqueName: \"kubernetes.io/projected/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-kube-api-access-6qqlt\") pod \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.091181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-config-data\") pod \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\" (UID: \"0bf8736f-97c7-4f81-af11-2cab4dfff4b0\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.092091 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-config-data" (OuterVolumeSpecName: "config-data") pod "0bf8736f-97c7-4f81-af11-2cab4dfff4b0" (UID: "0bf8736f-97c7-4f81-af11-2cab4dfff4b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.095558 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-scripts" (OuterVolumeSpecName: "scripts") pod "0bf8736f-97c7-4f81-af11-2cab4dfff4b0" (UID: "0bf8736f-97c7-4f81-af11-2cab4dfff4b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.095627 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-logs" (OuterVolumeSpecName: "logs") pod "0bf8736f-97c7-4f81-af11-2cab4dfff4b0" (UID: "0bf8736f-97c7-4f81-af11-2cab4dfff4b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.099432 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fz544"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.102347 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0bf8736f-97c7-4f81-af11-2cab4dfff4b0" (UID: "0bf8736f-97c7-4f81-af11-2cab4dfff4b0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.104088 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-kube-api-access-6qqlt" (OuterVolumeSpecName: "kube-api-access-6qqlt") pod "0bf8736f-97c7-4f81-af11-2cab4dfff4b0" (UID: "0bf8736f-97c7-4f81-af11-2cab4dfff4b0"). InnerVolumeSpecName "kube-api-access-6qqlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.117964 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-kube-api-access-z64tx" (OuterVolumeSpecName: "kube-api-access-z64tx") pod "9138f50d-1c41-4a52-84f6-2e3fa1e7f245" (UID: "9138f50d-1c41-4a52-84f6-2e3fa1e7f245"). InnerVolumeSpecName "kube-api-access-z64tx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.124244 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fz544"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.139816 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8zmhd"] Oct 01 13:25:20 crc kubenswrapper[4749]: E1001 13:25:20.140147 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57aa7c22-d447-48f7-b16b-517c8553dc09" containerName="keystone-bootstrap" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140165 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="57aa7c22-d447-48f7-b16b-517c8553dc09" containerName="keystone-bootstrap" Oct 01 13:25:20 crc kubenswrapper[4749]: E1001 13:25:20.140197 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerName="init" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140203 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerName="init" Oct 01 13:25:20 crc kubenswrapper[4749]: E1001 13:25:20.140230 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140237 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" Oct 01 13:25:20 crc kubenswrapper[4749]: E1001 13:25:20.140249 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api-log" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140255 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api-log" Oct 01 13:25:20 crc kubenswrapper[4749]: E1001 13:25:20.140265 
4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerName="dnsmasq-dns" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140270 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerName="dnsmasq-dns" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140453 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api-log" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140467 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" containerName="dnsmasq-dns" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140482 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="57aa7c22-d447-48f7-b16b-517c8553dc09" containerName="keystone-bootstrap" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.140492 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.141524 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.145589 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.146134 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-776bb" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.148888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.154638 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8zmhd"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.157929 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.161494 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9138f50d-1c41-4a52-84f6-2e3fa1e7f245" (UID: "9138f50d-1c41-4a52-84f6-2e3fa1e7f245"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.168906 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.169464 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f4bccb99-dh9mp" event={"ID":"9138f50d-1c41-4a52-84f6-2e3fa1e7f245","Type":"ContainerDied","Data":"709e9de2268a803099cc88ccc907480b8bf03330027c039cbe1a8c622a601b57"} Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.169530 4749 scope.go:117] "RemoveContainer" containerID="f7713d48470e3e620d6bdb4a22f20f9c3c9dfc31a1997b43431b79f3cbb5dac5" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.170044 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9138f50d-1c41-4a52-84f6-2e3fa1e7f245" (UID: "9138f50d-1c41-4a52-84f6-2e3fa1e7f245"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.184658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59bbdc7665-bcszc" event={"ID":"0bf8736f-97c7-4f81-af11-2cab4dfff4b0","Type":"ContainerDied","Data":"dbef047dd9d8febf3f7739bb6bd2b84a2420a15a93a3b7db2b51801b704a12b6"} Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.184794 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-59bbdc7665-bcszc" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.192750 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z64tx\" (UniqueName: \"kubernetes.io/projected/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-kube-api-access-z64tx\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.192769 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.192778 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.192786 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qqlt\" (UniqueName: \"kubernetes.io/projected/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-kube-api-access-6qqlt\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.192796 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.192804 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.192813 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 
13:25:20.192820 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bf8736f-97c7-4f81-af11-2cab4dfff4b0-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.191642 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"dd95f834-908d-4a7c-8552-b93425ae5dd8","Type":"ContainerStarted","Data":"527dd8d48c1e873adbfbaf3ed96221450695094899745892bd5f87a9ba412828"} Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.196562 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fab3f7e4-b73c-4e23-839f-719e5b8ca205","Type":"ContainerStarted","Data":"7eb16ed1d2c410890330eb87312d4001c4e15355f7efe7f1e591cd47470841b6"} Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.199766 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9138f50d-1c41-4a52-84f6-2e3fa1e7f245" (UID: "9138f50d-1c41-4a52-84f6-2e3fa1e7f245"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.206594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5j2w8" event={"ID":"9ea02603-ed44-4faa-ae1e-37cf61162fde","Type":"ContainerStarted","Data":"5aefa383a53b9a4e514e0c59c7f8455b1c72f300ad38b83b47060ca18c70a2a1"} Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.206671 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.231689 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.054732606 podStartE2EDuration="23.231669641s" podCreationTimestamp="2025-10-01 13:24:57 +0000 UTC" firstStartedPulling="2025-10-01 13:24:58.617589529 +0000 UTC m=+1158.671574428" lastFinishedPulling="2025-10-01 13:25:18.794526564 +0000 UTC m=+1178.848511463" observedRunningTime="2025-10-01 13:25:20.214805572 +0000 UTC m=+1180.268790471" watchObservedRunningTime="2025-10-01 13:25:20.231669641 +0000 UTC m=+1180.285654530" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.242821 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9138f50d-1c41-4a52-84f6-2e3fa1e7f245" (UID: "9138f50d-1c41-4a52-84f6-2e3fa1e7f245"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.258771 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-5j2w8" podStartSLOduration=2.944876931 podStartE2EDuration="23.258749644s" podCreationTimestamp="2025-10-01 13:24:57 +0000 UTC" firstStartedPulling="2025-10-01 13:24:58.780186865 +0000 UTC m=+1158.834171764" lastFinishedPulling="2025-10-01 13:25:19.094059578 +0000 UTC m=+1179.148044477" observedRunningTime="2025-10-01 13:25:20.250143389 +0000 UTC m=+1180.304128298" watchObservedRunningTime="2025-10-01 13:25:20.258749644 +0000 UTC m=+1180.312734543" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.264181 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-config" (OuterVolumeSpecName: "config") pod "9138f50d-1c41-4a52-84f6-2e3fa1e7f245" (UID: "9138f50d-1c41-4a52-84f6-2e3fa1e7f245"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.268573 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.789696635 podStartE2EDuration="23.268553034s" podCreationTimestamp="2025-10-01 13:24:57 +0000 UTC" firstStartedPulling="2025-10-01 13:24:58.315580773 +0000 UTC m=+1158.369565672" lastFinishedPulling="2025-10-01 13:25:18.794437132 +0000 UTC m=+1178.848422071" observedRunningTime="2025-10-01 13:25:20.267495433 +0000 UTC m=+1180.321480322" watchObservedRunningTime="2025-10-01 13:25:20.268553034 +0000 UTC m=+1180.322537933" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.301620 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-scripts\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.301667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxj4x\" (UniqueName: \"kubernetes.io/projected/f32f512f-9aba-40b8-9f16-bcb1151eab3f-kube-api-access-pxj4x\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.301775 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-fernet-keys\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.301894 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-credential-keys\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.301990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-config-data\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.302029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-combined-ca-bundle\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.302162 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.302180 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.302191 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9138f50d-1c41-4a52-84f6-2e3fa1e7f245-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.316702 4749 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.333429 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.342697 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.344657 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.348089 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.367660 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.376878 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59bbdc7665-bcszc"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.394075 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59bbdc7665-bcszc"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403294 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a825f7ae-11ba-4e63-8996-12e4b7b6216b-logs\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403336 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403404 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-config-data\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403455 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-scripts\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403472 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxj4x\" (UniqueName: \"kubernetes.io/projected/f32f512f-9aba-40b8-9f16-bcb1151eab3f-kube-api-access-pxj4x\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403526 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-fernet-keys\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5h7d\" (UniqueName: \"kubernetes.io/projected/a825f7ae-11ba-4e63-8996-12e4b7b6216b-kube-api-access-d5h7d\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403577 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403613 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-credential-keys\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403654 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-config-data\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.403678 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-combined-ca-bundle\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.406590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-combined-ca-bundle\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.408572 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-credential-keys\") pod \"keystone-bootstrap-8zmhd\" (UID: 
\"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.409563 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-fernet-keys\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.411145 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-scripts\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.412373 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-config-data\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.423674 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxj4x\" (UniqueName: \"kubernetes.io/projected/f32f512f-9aba-40b8-9f16-bcb1151eab3f-kube-api-access-pxj4x\") pod \"keystone-bootstrap-8zmhd\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") " pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.506062 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-config-data\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.506328 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d5h7d\" (UniqueName: \"kubernetes.io/projected/a825f7ae-11ba-4e63-8996-12e4b7b6216b-kube-api-access-d5h7d\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.506362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.506482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a825f7ae-11ba-4e63-8996-12e4b7b6216b-logs\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.506527 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.509530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a825f7ae-11ba-4e63-8996-12e4b7b6216b-logs\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.514036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-config-data\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " 
pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.514112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.512271 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.521423 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f4bccb99-dh9mp"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.524777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5h7d\" (UniqueName: \"kubernetes.io/projected/a825f7ae-11ba-4e63-8996-12e4b7b6216b-kube-api-access-d5h7d\") pod \"watcher-api-0\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.528801 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f4bccb99-dh9mp"] Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.575739 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8zmhd" Oct 01 13:25:20 crc kubenswrapper[4749]: W1001 13:25:20.639833 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ae81866_7b3d_44f4_a4fd_5b49e2223352.slice/crio-1bc40e0de8b2d1ef8f2d2792a39a5ca4e4e09a2a1e2054a2fcd8c584b0d11f60 WatchSource:0}: Error finding container 1bc40e0de8b2d1ef8f2d2792a39a5ca4e4e09a2a1e2054a2fcd8c584b0d11f60: Status 404 returned error can't find the container with id 1bc40e0de8b2d1ef8f2d2792a39a5ca4e4e09a2a1e2054a2fcd8c584b0d11f60 Oct 01 13:25:20 crc kubenswrapper[4749]: W1001 13:25:20.640867 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48377824_164a_42bd_9374_f1fadf80ddf6.slice/crio-63fd910d7b19c30d684700a34b41104320e5d03aa05a16aa4dff764cb1d5fd9b WatchSource:0}: Error finding container 63fd910d7b19c30d684700a34b41104320e5d03aa05a16aa4dff764cb1d5fd9b: Status 404 returned error can't find the container with id 63fd910d7b19c30d684700a34b41104320e5d03aa05a16aa4dff764cb1d5fd9b Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.650447 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.673769 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.702204 4749 scope.go:117] "RemoveContainer" containerID="151966dc1dc660950b8d52ac40df122796bac5e5eea85049d947ad7e333fad29" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.708379 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f86cj\" (UniqueName: \"kubernetes.io/projected/21d4a7bd-568e-4b4f-854a-ec3963262172-kube-api-access-f86cj\") pod \"21d4a7bd-568e-4b4f-854a-ec3963262172\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.708453 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21d4a7bd-568e-4b4f-854a-ec3963262172-horizon-secret-key\") pod \"21d4a7bd-568e-4b4f-854a-ec3963262172\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.708479 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-config-data\") pod \"21d4a7bd-568e-4b4f-854a-ec3963262172\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.708559 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-scripts\") pod \"21d4a7bd-568e-4b4f-854a-ec3963262172\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.708644 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21d4a7bd-568e-4b4f-854a-ec3963262172-logs\") pod \"21d4a7bd-568e-4b4f-854a-ec3963262172\" (UID: \"21d4a7bd-568e-4b4f-854a-ec3963262172\") " Oct 01 13:25:20 
crc kubenswrapper[4749]: I1001 13:25:20.709465 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21d4a7bd-568e-4b4f-854a-ec3963262172-logs" (OuterVolumeSpecName: "logs") pod "21d4a7bd-568e-4b4f-854a-ec3963262172" (UID: "21d4a7bd-568e-4b4f-854a-ec3963262172"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.709509 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-config-data" (OuterVolumeSpecName: "config-data") pod "21d4a7bd-568e-4b4f-854a-ec3963262172" (UID: "21d4a7bd-568e-4b4f-854a-ec3963262172"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.709540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-scripts" (OuterVolumeSpecName: "scripts") pod "21d4a7bd-568e-4b4f-854a-ec3963262172" (UID: "21d4a7bd-568e-4b4f-854a-ec3963262172"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.713246 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d4a7bd-568e-4b4f-854a-ec3963262172-kube-api-access-f86cj" (OuterVolumeSpecName: "kube-api-access-f86cj") pod "21d4a7bd-568e-4b4f-854a-ec3963262172" (UID: "21d4a7bd-568e-4b4f-854a-ec3963262172"). InnerVolumeSpecName "kube-api-access-f86cj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.713387 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d4a7bd-568e-4b4f-854a-ec3963262172-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "21d4a7bd-568e-4b4f-854a-ec3963262172" (UID: "21d4a7bd-568e-4b4f-854a-ec3963262172"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.810254 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.810278 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21d4a7bd-568e-4b4f-854a-ec3963262172-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.810287 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f86cj\" (UniqueName: \"kubernetes.io/projected/21d4a7bd-568e-4b4f-854a-ec3963262172-kube-api-access-f86cj\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.810297 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/21d4a7bd-568e-4b4f-854a-ec3963262172-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:20 crc kubenswrapper[4749]: I1001 13:25:20.810305 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21d4a7bd-568e-4b4f-854a-ec3963262172-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.271852 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf8736f-97c7-4f81-af11-2cab4dfff4b0" 
path="/var/lib/kubelet/pods/0bf8736f-97c7-4f81-af11-2cab4dfff4b0/volumes" Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.272306 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57aa7c22-d447-48f7-b16b-517c8553dc09" path="/var/lib/kubelet/pods/57aa7c22-d447-48f7-b16b-517c8553dc09/volumes" Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.273984 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" path="/var/lib/kubelet/pods/8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b/volumes" Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.275123 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9138f50d-1c41-4a52-84f6-2e3fa1e7f245" path="/var/lib/kubelet/pods/9138f50d-1c41-4a52-84f6-2e3fa1e7f245/volumes" Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.275943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84f6c74b66-trbb9" event={"ID":"7ae81866-7b3d-44f4-a4fd-5b49e2223352","Type":"ContainerStarted","Data":"1bc40e0de8b2d1ef8f2d2792a39a5ca4e4e09a2a1e2054a2fcd8c584b0d11f60"} Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.275967 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0725-account-create-nxf2m" event={"ID":"48377824-164a-42bd-9374-f1fadf80ddf6","Type":"ContainerStarted","Data":"810c58ee7255f52da3bbd535560e2957e9307c16dd19359a94c131804b3326aa"} Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.275979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0725-account-create-nxf2m" event={"ID":"48377824-164a-42bd-9374-f1fadf80ddf6","Type":"ContainerStarted","Data":"63fd910d7b19c30d684700a34b41104320e5d03aa05a16aa4dff764cb1d5fd9b"} Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.279962 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d96cdbf7c-fpnt7" 
event={"ID":"00959eed-1bce-4b1a-9978-e978b9bdb4cf","Type":"ContainerStarted","Data":"f51535870729f017a6f692d292a76cf1eac326a0a301cfebf29c19eea1651302"} Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.282618 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b74b5b846-r84t7" event={"ID":"e22321e2-ded2-4732-ac89-f9f0d4dcd199","Type":"ContainerStarted","Data":"890f8a1ae91ebd8e927ab0a847cce643d7fbe3f97edefabcd035376df1aa5057"} Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.286872 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-867ff5dd5c-2lf7x" event={"ID":"21d4a7bd-568e-4b4f-854a-ec3963262172","Type":"ContainerDied","Data":"19a9229bbd35a9fb02509a56615bc8f5abb556125be38e09eeab5b9f105f08f6"} Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.286997 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-867ff5dd5c-2lf7x" Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.298169 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4d06-account-create-rwqkp" event={"ID":"75ae7232-a3d3-43dc-b003-f6e47b5e6868","Type":"ContainerStarted","Data":"30f5468ddca0eb18f8466a6f923fe32246dbf068e3adac0b4be4d2c425be1360"} Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.331862 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8zmhd"] Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.379036 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0725-account-create-nxf2m" podStartSLOduration=19.379011922 podStartE2EDuration="19.379011922s" podCreationTimestamp="2025-10-01 13:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:25:21.363628736 +0000 UTC m=+1181.417613635" watchObservedRunningTime="2025-10-01 13:25:21.379011922 +0000 UTC 
m=+1181.432996841" Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.416442 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-867ff5dd5c-2lf7x"] Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.428724 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-867ff5dd5c-2lf7x"] Oct 01 13:25:21 crc kubenswrapper[4749]: I1001 13:25:21.454303 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.307472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8zmhd" event={"ID":"f32f512f-9aba-40b8-9f16-bcb1151eab3f","Type":"ContainerStarted","Data":"3fb95f4cc56a0375e71e2b3c51326e8e57407bd5a387962adb5b31a1bd5478ad"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.307523 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8zmhd" event={"ID":"f32f512f-9aba-40b8-9f16-bcb1151eab3f","Type":"ContainerStarted","Data":"d71056b9bbad7d19ef12622c0a9c4c06d0fbc74e24aeae388af50a03ce3567f6"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.309091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d96cdbf7c-fpnt7" event={"ID":"00959eed-1bce-4b1a-9978-e978b9bdb4cf","Type":"ContainerStarted","Data":"66530ca5a8ae469488d794e90dd097c3615d835cd05524f07db2510f26e1b8e0"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.309138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d96cdbf7c-fpnt7" event={"ID":"00959eed-1bce-4b1a-9978-e978b9bdb4cf","Type":"ContainerStarted","Data":"5c6a4dca9736198a826b2581e2b71aac1596e33d3187fafb784fdac24534fce0"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.309181 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d96cdbf7c-fpnt7" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerName="horizon-log" 
containerID="cri-o://5c6a4dca9736198a826b2581e2b71aac1596e33d3187fafb784fdac24534fce0" gracePeriod=30 Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.309229 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d96cdbf7c-fpnt7" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerName="horizon" containerID="cri-o://66530ca5a8ae469488d794e90dd097c3615d835cd05524f07db2510f26e1b8e0" gracePeriod=30 Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.315415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b74b5b846-r84t7" event={"ID":"e22321e2-ded2-4732-ac89-f9f0d4dcd199","Type":"ContainerStarted","Data":"2563acaa828e117607c9684b2e8c1e19eaa5311431dc2ca431006145aa5f77a8"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.315871 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b74b5b846-r84t7" event={"ID":"e22321e2-ded2-4732-ac89-f9f0d4dcd199","Type":"ContainerStarted","Data":"6265d6bca99b9736f4c6d44335a054b5ebdddd8b358758604dff711692bd91cc"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.317150 4749 generic.go:334] "Generic (PLEG): container finished" podID="75ae7232-a3d3-43dc-b003-f6e47b5e6868" containerID="f2c7a8ac184830fdaabe91a4deed389ee9f47f9f066b45e87c69b75bc57bce8e" exitCode=0 Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.317203 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4d06-account-create-rwqkp" event={"ID":"75ae7232-a3d3-43dc-b003-f6e47b5e6868","Type":"ContainerDied","Data":"f2c7a8ac184830fdaabe91a4deed389ee9f47f9f066b45e87c69b75bc57bce8e"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.326036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a825f7ae-11ba-4e63-8996-12e4b7b6216b","Type":"ContainerStarted","Data":"df69449cb75afde5077944a7c210733d4e770a4ec99aa4e72823174395f223c2"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 
13:25:22.326078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a825f7ae-11ba-4e63-8996-12e4b7b6216b","Type":"ContainerStarted","Data":"91f8c3ba19c8a5c8bb236248a8de1ad52b586be4364bd154c8dc4552652bb22c"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.326088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a825f7ae-11ba-4e63-8996-12e4b7b6216b","Type":"ContainerStarted","Data":"e6a0e04386f22018b4e8bfd587272eb4e785b3194d3ebe4f47aa2551d81c90bc"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.326272 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.327925 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.337446 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8zmhd" podStartSLOduration=2.337430756 podStartE2EDuration="2.337430756s" podCreationTimestamp="2025-10-01 13:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:25:22.330266503 +0000 UTC m=+1182.384251402" watchObservedRunningTime="2025-10-01 13:25:22.337430756 +0000 UTC m=+1182.391415655" Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.342444 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84f6c74b66-trbb9" event={"ID":"7ae81866-7b3d-44f4-a4fd-5b49e2223352","Type":"ContainerStarted","Data":"a94805c30f134bb40119ae76ad5949c8770747149e8665ebf0949b027b1ecb2a"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 
13:25:22.345560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84f6c74b66-trbb9" event={"ID":"7ae81866-7b3d-44f4-a4fd-5b49e2223352","Type":"ContainerStarted","Data":"070469cc902a7a3114f0e257a6257c4a6d561f9bd6e5bdf6b9ee86f6304d2e5f"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.349746 4749 generic.go:334] "Generic (PLEG): container finished" podID="48377824-164a-42bd-9374-f1fadf80ddf6" containerID="810c58ee7255f52da3bbd535560e2957e9307c16dd19359a94c131804b3326aa" exitCode=0 Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.349802 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0725-account-create-nxf2m" event={"ID":"48377824-164a-42bd-9374-f1fadf80ddf6","Type":"ContainerDied","Data":"810c58ee7255f52da3bbd535560e2957e9307c16dd19359a94c131804b3326aa"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.361525 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e52665-f55b-4137-82ec-7ab7392bca61","Type":"ContainerStarted","Data":"88a4616d6bd1607daba64394bdfadf2bab1654bff024f2f5b236321a1f132f85"} Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.388702 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d96cdbf7c-fpnt7" podStartSLOduration=23.185494184 podStartE2EDuration="23.388683044s" podCreationTimestamp="2025-10-01 13:24:59 +0000 UTC" firstStartedPulling="2025-10-01 13:25:20.704458908 +0000 UTC m=+1180.758443817" lastFinishedPulling="2025-10-01 13:25:20.907647778 +0000 UTC m=+1180.961632677" observedRunningTime="2025-10-01 13:25:22.363442446 +0000 UTC m=+1182.417427345" watchObservedRunningTime="2025-10-01 13:25:22.388683044 +0000 UTC m=+1182.442667943" Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.416474 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b74b5b846-r84t7" podStartSLOduration=16.164633067 
podStartE2EDuration="16.416458967s" podCreationTimestamp="2025-10-01 13:25:06 +0000 UTC" firstStartedPulling="2025-10-01 13:25:20.703525691 +0000 UTC m=+1180.757510600" lastFinishedPulling="2025-10-01 13:25:20.955351601 +0000 UTC m=+1181.009336500" observedRunningTime="2025-10-01 13:25:22.412693925 +0000 UTC m=+1182.466678824" watchObservedRunningTime="2025-10-01 13:25:22.416458967 +0000 UTC m=+1182.470443866" Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.452286 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=2.4522683880000002 podStartE2EDuration="2.452268388s" podCreationTimestamp="2025-10-01 13:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:25:22.451589518 +0000 UTC m=+1182.505574417" watchObservedRunningTime="2025-10-01 13:25:22.452268388 +0000 UTC m=+1182.506253287" Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.504669 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84f6c74b66-trbb9" podStartSLOduration=16.258461977 podStartE2EDuration="16.50465102s" podCreationTimestamp="2025-10-01 13:25:06 +0000 UTC" firstStartedPulling="2025-10-01 13:25:20.642132652 +0000 UTC m=+1180.696117551" lastFinishedPulling="2025-10-01 13:25:20.888321695 +0000 UTC m=+1180.942306594" observedRunningTime="2025-10-01 13:25:22.495941572 +0000 UTC m=+1182.549926471" watchObservedRunningTime="2025-10-01 13:25:22.50465102 +0000 UTC m=+1182.558635909" Oct 01 13:25:22 crc kubenswrapper[4749]: I1001 13:25:22.542738 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="8399b1b5-243a-4eec-a8ef-0ee3c5d7b45b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.150:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:25:22 crc 
kubenswrapper[4749]: I1001 13:25:22.657657 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Oct 01 13:25:23 crc kubenswrapper[4749]: I1001 13:25:23.239187 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d4a7bd-568e-4b4f-854a-ec3963262172" path="/var/lib/kubelet/pods/21d4a7bd-568e-4b4f-854a-ec3963262172/volumes"
Oct 01 13:25:23 crc kubenswrapper[4749]: I1001 13:25:23.859566 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0725-account-create-nxf2m"
Oct 01 13:25:23 crc kubenswrapper[4749]: I1001 13:25:23.870641 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4d06-account-create-rwqkp"
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.029516 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2k6p\" (UniqueName: \"kubernetes.io/projected/48377824-164a-42bd-9374-f1fadf80ddf6-kube-api-access-l2k6p\") pod \"48377824-164a-42bd-9374-f1fadf80ddf6\" (UID: \"48377824-164a-42bd-9374-f1fadf80ddf6\") "
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.029582 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5l4\" (UniqueName: \"kubernetes.io/projected/75ae7232-a3d3-43dc-b003-f6e47b5e6868-kube-api-access-pk5l4\") pod \"75ae7232-a3d3-43dc-b003-f6e47b5e6868\" (UID: \"75ae7232-a3d3-43dc-b003-f6e47b5e6868\") "
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.035482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48377824-164a-42bd-9374-f1fadf80ddf6-kube-api-access-l2k6p" (OuterVolumeSpecName: "kube-api-access-l2k6p") pod "48377824-164a-42bd-9374-f1fadf80ddf6" (UID: "48377824-164a-42bd-9374-f1fadf80ddf6"). InnerVolumeSpecName "kube-api-access-l2k6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.052857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75ae7232-a3d3-43dc-b003-f6e47b5e6868-kube-api-access-pk5l4" (OuterVolumeSpecName: "kube-api-access-pk5l4") pod "75ae7232-a3d3-43dc-b003-f6e47b5e6868" (UID: "75ae7232-a3d3-43dc-b003-f6e47b5e6868"). InnerVolumeSpecName "kube-api-access-pk5l4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.130760 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2k6p\" (UniqueName: \"kubernetes.io/projected/48377824-164a-42bd-9374-f1fadf80ddf6-kube-api-access-l2k6p\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.130791 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk5l4\" (UniqueName: \"kubernetes.io/projected/75ae7232-a3d3-43dc-b003-f6e47b5e6868-kube-api-access-pk5l4\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.380247 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0725-account-create-nxf2m" event={"ID":"48377824-164a-42bd-9374-f1fadf80ddf6","Type":"ContainerDied","Data":"63fd910d7b19c30d684700a34b41104320e5d03aa05a16aa4dff764cb1d5fd9b"}
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.380287 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63fd910d7b19c30d684700a34b41104320e5d03aa05a16aa4dff764cb1d5fd9b"
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.380300 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0725-account-create-nxf2m"
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.381843 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4d06-account-create-rwqkp" event={"ID":"75ae7232-a3d3-43dc-b003-f6e47b5e6868","Type":"ContainerDied","Data":"30f5468ddca0eb18f8466a6f923fe32246dbf068e3adac0b4be4d2c425be1360"}
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.381866 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f5468ddca0eb18f8466a6f923fe32246dbf068e3adac0b4be4d2c425be1360"
Oct 01 13:25:24 crc kubenswrapper[4749]: I1001 13:25:24.381914 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4d06-account-create-rwqkp"
Oct 01 13:25:25 crc kubenswrapper[4749]: I1001 13:25:25.674818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Oct 01 13:25:25 crc kubenswrapper[4749]: I1001 13:25:25.852587 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.406780 4749 generic.go:334] "Generic (PLEG): container finished" podID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerID="7eb16ed1d2c410890330eb87312d4001c4e15355f7efe7f1e591cd47470841b6" exitCode=1
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.406870 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fab3f7e4-b73c-4e23-839f-719e5b8ca205","Type":"ContainerDied","Data":"7eb16ed1d2c410890330eb87312d4001c4e15355f7efe7f1e591cd47470841b6"}
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.408088 4749 scope.go:117] "RemoveContainer" containerID="7eb16ed1d2c410890330eb87312d4001c4e15355f7efe7f1e591cd47470841b6"
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.409158 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ea02603-ed44-4faa-ae1e-37cf61162fde" containerID="5aefa383a53b9a4e514e0c59c7f8455b1c72f300ad38b83b47060ca18c70a2a1" exitCode=0
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.409203 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5j2w8" event={"ID":"9ea02603-ed44-4faa-ae1e-37cf61162fde","Type":"ContainerDied","Data":"5aefa383a53b9a4e514e0c59c7f8455b1c72f300ad38b83b47060ca18c70a2a1"}
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.410878 4749 generic.go:334] "Generic (PLEG): container finished" podID="f32f512f-9aba-40b8-9f16-bcb1151eab3f" containerID="3fb95f4cc56a0375e71e2b3c51326e8e57407bd5a387962adb5b31a1bd5478ad" exitCode=0
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.411013 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8zmhd" event={"ID":"f32f512f-9aba-40b8-9f16-bcb1151eab3f","Type":"ContainerDied","Data":"3fb95f4cc56a0375e71e2b3c51326e8e57407bd5a387962adb5b31a1bd5478ad"}
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.697943 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-84f6c74b66-trbb9"
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.698017 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84f6c74b66-trbb9"
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.794740 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b74b5b846-r84t7"
Oct 01 13:25:26 crc kubenswrapper[4749]: I1001 13:25:26.795736 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b74b5b846-r84t7"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.456457 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.456504 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.456515 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.456524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.580273 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6mlkv"]
Oct 01 13:25:27 crc kubenswrapper[4749]: E1001 13:25:27.580679 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75ae7232-a3d3-43dc-b003-f6e47b5e6868" containerName="mariadb-account-create"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.580700 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="75ae7232-a3d3-43dc-b003-f6e47b5e6868" containerName="mariadb-account-create"
Oct 01 13:25:27 crc kubenswrapper[4749]: E1001 13:25:27.580740 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48377824-164a-42bd-9374-f1fadf80ddf6" containerName="mariadb-account-create"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.580747 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="48377824-164a-42bd-9374-f1fadf80ddf6" containerName="mariadb-account-create"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.580904 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="75ae7232-a3d3-43dc-b003-f6e47b5e6868" containerName="mariadb-account-create"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.580949 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="48377824-164a-42bd-9374-f1fadf80ddf6" containerName="mariadb-account-create"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.581539 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.584715 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.584796 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5spks"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.594359 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6mlkv"]
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.594649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jh85\" (UniqueName: \"kubernetes.io/projected/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-kube-api-access-8jh85\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.595175 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-combined-ca-bundle\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.595567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-db-sync-config-data\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.658270 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.697237 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-db-sync-config-data\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.697306 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jh85\" (UniqueName: \"kubernetes.io/projected/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-kube-api-access-8jh85\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.697428 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-combined-ca-bundle\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.697858 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tc29m"]
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.700235 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.701636 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-w28m5"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.702437 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.703467 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.704003 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-combined-ca-bundle\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.721557 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.726469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-db-sync-config-data\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.776251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jh85\" (UniqueName: \"kubernetes.io/projected/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-kube-api-access-8jh85\") pod \"barbican-db-sync-6mlkv\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.776328 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tc29m"]
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.904757 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-combined-ca-bundle\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.904861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-scripts\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.904939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9ead71f-c58e-4634-96f9-81c9b165e24c-etc-machine-id\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.905007 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbx2b\" (UniqueName: \"kubernetes.io/projected/d9ead71f-c58e-4634-96f9-81c9b165e24c-kube-api-access-mbx2b\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.905059 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-db-sync-config-data\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.905084 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-config-data\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.927901 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6mlkv"
Oct 01 13:25:27 crc kubenswrapper[4749]: I1001 13:25:27.971066 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5j2w8"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.008641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbx2b\" (UniqueName: \"kubernetes.io/projected/d9ead71f-c58e-4634-96f9-81c9b165e24c-kube-api-access-mbx2b\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.008971 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-db-sync-config-data\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.008997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-config-data\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.009053 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-combined-ca-bundle\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.012066 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-scripts\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.012191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9ead71f-c58e-4634-96f9-81c9b165e24c-etc-machine-id\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.015324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9ead71f-c58e-4634-96f9-81c9b165e24c-etc-machine-id\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.031313 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-db-sync-config-data\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.033935 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-config-data\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.034523 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8zmhd"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.034660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-combined-ca-bundle\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.052347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbx2b\" (UniqueName: \"kubernetes.io/projected/d9ead71f-c58e-4634-96f9-81c9b165e24c-kube-api-access-mbx2b\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.053291 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-scripts\") pod \"cinder-db-sync-tc29m\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-combined-ca-bundle\") pod \"9ea02603-ed44-4faa-ae1e-37cf61162fde\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120207 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ea02603-ed44-4faa-ae1e-37cf61162fde-logs\") pod \"9ea02603-ed44-4faa-ae1e-37cf61162fde\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120282 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-fernet-keys\") pod \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120319 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-scripts\") pod \"9ea02603-ed44-4faa-ae1e-37cf61162fde\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-config-data\") pod \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120369 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-credential-keys\") pod \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120387 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-scripts\") pod \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxj4x\" (UniqueName: \"kubernetes.io/projected/f32f512f-9aba-40b8-9f16-bcb1151eab3f-kube-api-access-pxj4x\") pod \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-config-data\") pod \"9ea02603-ed44-4faa-ae1e-37cf61162fde\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120443 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-combined-ca-bundle\") pod \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\" (UID: \"f32f512f-9aba-40b8-9f16-bcb1151eab3f\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.120478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8pmv\" (UniqueName: \"kubernetes.io/projected/9ea02603-ed44-4faa-ae1e-37cf61162fde-kube-api-access-k8pmv\") pod \"9ea02603-ed44-4faa-ae1e-37cf61162fde\" (UID: \"9ea02603-ed44-4faa-ae1e-37cf61162fde\") "
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.123647 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea02603-ed44-4faa-ae1e-37cf61162fde-kube-api-access-k8pmv" (OuterVolumeSpecName: "kube-api-access-k8pmv") pod "9ea02603-ed44-4faa-ae1e-37cf61162fde" (UID: "9ea02603-ed44-4faa-ae1e-37cf61162fde"). InnerVolumeSpecName "kube-api-access-k8pmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.136431 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ea02603-ed44-4faa-ae1e-37cf61162fde-logs" (OuterVolumeSpecName: "logs") pod "9ea02603-ed44-4faa-ae1e-37cf61162fde" (UID: "9ea02603-ed44-4faa-ae1e-37cf61162fde"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.152366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f32f512f-9aba-40b8-9f16-bcb1151eab3f" (UID: "f32f512f-9aba-40b8-9f16-bcb1151eab3f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.155528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f32f512f-9aba-40b8-9f16-bcb1151eab3f" (UID: "f32f512f-9aba-40b8-9f16-bcb1151eab3f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.160314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-scripts" (OuterVolumeSpecName: "scripts") pod "9ea02603-ed44-4faa-ae1e-37cf61162fde" (UID: "9ea02603-ed44-4faa-ae1e-37cf61162fde"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.160657 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-scripts" (OuterVolumeSpecName: "scripts") pod "f32f512f-9aba-40b8-9f16-bcb1151eab3f" (UID: "f32f512f-9aba-40b8-9f16-bcb1151eab3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.167087 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ea02603-ed44-4faa-ae1e-37cf61162fde" (UID: "9ea02603-ed44-4faa-ae1e-37cf61162fde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.171929 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f32f512f-9aba-40b8-9f16-bcb1151eab3f-kube-api-access-pxj4x" (OuterVolumeSpecName: "kube-api-access-pxj4x") pod "f32f512f-9aba-40b8-9f16-bcb1151eab3f" (UID: "f32f512f-9aba-40b8-9f16-bcb1151eab3f"). InnerVolumeSpecName "kube-api-access-pxj4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.211276 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-config-data" (OuterVolumeSpecName: "config-data") pod "9ea02603-ed44-4faa-ae1e-37cf61162fde" (UID: "9ea02603-ed44-4faa-ae1e-37cf61162fde"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.218413 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f32f512f-9aba-40b8-9f16-bcb1151eab3f" (UID: "f32f512f-9aba-40b8-9f16-bcb1151eab3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224691 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224719 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ea02603-ed44-4faa-ae1e-37cf61162fde-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224729 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224737 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224745 4749 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224754 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224762 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxj4x\" (UniqueName: \"kubernetes.io/projected/f32f512f-9aba-40b8-9f16-bcb1151eab3f-kube-api-access-pxj4x\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224773 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ea02603-ed44-4faa-ae1e-37cf61162fde-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224781 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.224789 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8pmv\" (UniqueName: \"kubernetes.io/projected/9ea02603-ed44-4faa-ae1e-37cf61162fde-kube-api-access-k8pmv\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.248912 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tc29m"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.256482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-config-data" (OuterVolumeSpecName: "config-data") pod "f32f512f-9aba-40b8-9f16-bcb1151eab3f" (UID: "f32f512f-9aba-40b8-9f16-bcb1151eab3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.339134 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f32f512f-9aba-40b8-9f16-bcb1151eab3f-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.442110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e52665-f55b-4137-82ec-7ab7392bca61","Type":"ContainerStarted","Data":"5ca1278161feea90d77b20382284269477a0f48da0fd9a28f3f40443811b3cac"}
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.450574 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8zmhd"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.450587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8zmhd" event={"ID":"f32f512f-9aba-40b8-9f16-bcb1151eab3f","Type":"ContainerDied","Data":"d71056b9bbad7d19ef12622c0a9c4c06d0fbc74e24aeae388af50a03ce3567f6"}
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.450632 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d71056b9bbad7d19ef12622c0a9c4c06d0fbc74e24aeae388af50a03ce3567f6"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.463460 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fab3f7e4-b73c-4e23-839f-719e5b8ca205","Type":"ContainerStarted","Data":"a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7"}
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.466489 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5j2w8"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.478579 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5j2w8" event={"ID":"9ea02603-ed44-4faa-ae1e-37cf61162fde","Type":"ContainerDied","Data":"71421e79c0324bf67f0b678a0ad34a7d020a713a53123b01750ca10a9ae5aa2d"}
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.478623 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71421e79c0324bf67f0b678a0ad34a7d020a713a53123b01750ca10a9ae5aa2d"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.511840 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.573181 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.605768 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6mlkv"]
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.631320 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ffbb6dc5b-8kwbn"]
Oct 01 13:25:28 crc kubenswrapper[4749]: E1001 13:25:28.631953 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea02603-ed44-4faa-ae1e-37cf61162fde" containerName="placement-db-sync"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.631966 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea02603-ed44-4faa-ae1e-37cf61162fde" containerName="placement-db-sync"
Oct 01 13:25:28 crc kubenswrapper[4749]: E1001 13:25:28.631997 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f32f512f-9aba-40b8-9f16-bcb1151eab3f" containerName="keystone-bootstrap"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.632004 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f32f512f-9aba-40b8-9f16-bcb1151eab3f" containerName="keystone-bootstrap"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.632190 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f32f512f-9aba-40b8-9f16-bcb1151eab3f" containerName="keystone-bootstrap"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.632225 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea02603-ed44-4faa-ae1e-37cf61162fde" containerName="placement-db-sync"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.634576 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ffbb6dc5b-8kwbn"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.642334 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.642553 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hp8k2"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.642667 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.642858 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.642995 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.647839 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ffbb6dc5b-8kwbn"]
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.699824 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-658b64fcb-k2w2c"]
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.701284 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-658b64fcb-k2w2c"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.710478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-658b64fcb-k2w2c"]
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.712891 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.713228 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.713611 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.714913 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.714958 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.716125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-776bb"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.753986 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-config-data\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.754082 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d400dce-67f7-4e74-b2b9-85f0302a3e43-logs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn"
Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.754129 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-internal-tls-certs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.754210 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-public-tls-certs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.754248 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmch7\" (UniqueName: \"kubernetes.io/projected/5d400dce-67f7-4e74-b2b9-85f0302a3e43-kube-api-access-fmch7\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.754266 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-combined-ca-bundle\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.754287 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-scripts\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " 
pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.804733 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tc29m"] Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.855847 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-public-tls-certs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.855888 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-credential-keys\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.855907 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-config-data\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.855931 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmch7\" (UniqueName: \"kubernetes.io/projected/5d400dce-67f7-4e74-b2b9-85f0302a3e43-kube-api-access-fmch7\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.855947 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-combined-ca-bundle\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.855973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-scripts\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.855990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-fernet-keys\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.856014 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvjkl\" (UniqueName: \"kubernetes.io/projected/5b73cc62-6695-480f-90cc-8d1f4b5993b3-kube-api-access-wvjkl\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.856030 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-scripts\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.856048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-public-tls-certs\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.856065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-config-data\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.856094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-combined-ca-bundle\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.856127 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d400dce-67f7-4e74-b2b9-85f0302a3e43-logs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.856164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-internal-tls-certs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.856195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-internal-tls-certs\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.858781 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d400dce-67f7-4e74-b2b9-85f0302a3e43-logs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.865626 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-internal-tls-certs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.865740 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-combined-ca-bundle\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.866174 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-config-data\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.866510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-public-tls-certs\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: 
\"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.868570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d400dce-67f7-4e74-b2b9-85f0302a3e43-scripts\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.876870 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmch7\" (UniqueName: \"kubernetes.io/projected/5d400dce-67f7-4e74-b2b9-85f0302a3e43-kube-api-access-fmch7\") pod \"placement-ffbb6dc5b-8kwbn\" (UID: \"5d400dce-67f7-4e74-b2b9-85f0302a3e43\") " pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.957371 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-credential-keys\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.957414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-config-data\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.957443 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-fernet-keys\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: 
I1001 13:25:28.957468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvjkl\" (UniqueName: \"kubernetes.io/projected/5b73cc62-6695-480f-90cc-8d1f4b5993b3-kube-api-access-wvjkl\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.957484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-scripts\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.957504 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-public-tls-certs\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.957535 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-combined-ca-bundle\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.960507 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-internal-tls-certs\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.962162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-config-data\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.962624 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-public-tls-certs\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.964193 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-combined-ca-bundle\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.964370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-fernet-keys\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.984042 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-credential-keys\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.984811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-scripts\") pod \"keystone-658b64fcb-k2w2c\" (UID: 
\"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.985279 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b73cc62-6695-480f-90cc-8d1f4b5993b3-internal-tls-certs\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:28 crc kubenswrapper[4749]: I1001 13:25:28.987660 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvjkl\" (UniqueName: \"kubernetes.io/projected/5b73cc62-6695-480f-90cc-8d1f4b5993b3-kube-api-access-wvjkl\") pod \"keystone-658b64fcb-k2w2c\" (UID: \"5b73cc62-6695-480f-90cc-8d1f4b5993b3\") " pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:29 crc kubenswrapper[4749]: I1001 13:25:29.002124 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:29 crc kubenswrapper[4749]: I1001 13:25:29.100050 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:29 crc kubenswrapper[4749]: I1001 13:25:29.506345 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tc29m" event={"ID":"d9ead71f-c58e-4634-96f9-81c9b165e24c","Type":"ContainerStarted","Data":"0e0920003169871c235816c60069d2bc2cf2e0d63df5edcf1c6c2b5c1184685a"} Oct 01 13:25:29 crc kubenswrapper[4749]: I1001 13:25:29.525266 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6mlkv" event={"ID":"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78","Type":"ContainerStarted","Data":"715cea8c50c24d08f0152c4b232e3780d93309ce310c0ace6923f285c76df6cf"} Oct 01 13:25:29 crc kubenswrapper[4749]: I1001 13:25:29.589015 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ffbb6dc5b-8kwbn"] Oct 01 13:25:29 crc kubenswrapper[4749]: W1001 13:25:29.611458 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d400dce_67f7_4e74_b2b9_85f0302a3e43.slice/crio-cff3f22e6b966ca92306fb87d540e60dedb70e56b4945414bdd311b48d37ee48 WatchSource:0}: Error finding container cff3f22e6b966ca92306fb87d540e60dedb70e56b4945414bdd311b48d37ee48: Status 404 returned error can't find the container with id cff3f22e6b966ca92306fb87d540e60dedb70e56b4945414bdd311b48d37ee48 Oct 01 13:25:29 crc kubenswrapper[4749]: I1001 13:25:29.786151 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-658b64fcb-k2w2c"] Oct 01 13:25:29 crc kubenswrapper[4749]: W1001 13:25:29.813260 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b73cc62_6695_480f_90cc_8d1f4b5993b3.slice/crio-7faa36d60ca15307f5476439769f1c8d954e4be8e7d30fff50f569f8d52dc3a5 WatchSource:0}: Error finding container 7faa36d60ca15307f5476439769f1c8d954e4be8e7d30fff50f569f8d52dc3a5: Status 404 returned error can't find 
the container with id 7faa36d60ca15307f5476439769f1c8d954e4be8e7d30fff50f569f8d52dc3a5 Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.238101 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.541706 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ffbb6dc5b-8kwbn" event={"ID":"5d400dce-67f7-4e74-b2b9-85f0302a3e43","Type":"ContainerStarted","Data":"fb5811859f8b5c7a4b4a5876196fb35a0c4adccdbd9ab99ff7f0d16b84a3bc18"} Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.542010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ffbb6dc5b-8kwbn" event={"ID":"5d400dce-67f7-4e74-b2b9-85f0302a3e43","Type":"ContainerStarted","Data":"0dbaf9964bda28c4782e7dbf8d6fdf077b062b07cff7f00f0b341ebe0a7715e4"} Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.542022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ffbb6dc5b-8kwbn" event={"ID":"5d400dce-67f7-4e74-b2b9-85f0302a3e43","Type":"ContainerStarted","Data":"cff3f22e6b966ca92306fb87d540e60dedb70e56b4945414bdd311b48d37ee48"} Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.543182 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.543207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.550665 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="dd95f834-908d-4a7c-8552-b93425ae5dd8" containerName="watcher-applier" containerID="cri-o://527dd8d48c1e873adbfbaf3ed96221450695094899745892bd5f87a9ba412828" gracePeriod=30 Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.551057 4749 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-658b64fcb-k2w2c" event={"ID":"5b73cc62-6695-480f-90cc-8d1f4b5993b3","Type":"ContainerStarted","Data":"b6c75462cc83a15332c81fb31d2762c7ffd8b231fe77428645379bebdb5f7794"} Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.551087 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-658b64fcb-k2w2c" event={"ID":"5b73cc62-6695-480f-90cc-8d1f4b5993b3","Type":"ContainerStarted","Data":"7faa36d60ca15307f5476439769f1c8d954e4be8e7d30fff50f569f8d52dc3a5"} Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.551267 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.573336 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-ffbb6dc5b-8kwbn" podStartSLOduration=2.57331958 podStartE2EDuration="2.57331958s" podCreationTimestamp="2025-10-01 13:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:25:30.5668972 +0000 UTC m=+1190.620882119" watchObservedRunningTime="2025-10-01 13:25:30.57331958 +0000 UTC m=+1190.627304479" Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.590834 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-658b64fcb-k2w2c" podStartSLOduration=2.5908136280000003 podStartE2EDuration="2.590813628s" podCreationTimestamp="2025-10-01 13:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:25:30.590383866 +0000 UTC m=+1190.644368775" watchObservedRunningTime="2025-10-01 13:25:30.590813628 +0000 UTC m=+1190.644798537" Oct 01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.675577 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 
01 13:25:30 crc kubenswrapper[4749]: I1001 13:25:30.695990 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 01 13:25:31 crc kubenswrapper[4749]: I1001 13:25:31.566569 4749 generic.go:334] "Generic (PLEG): container finished" podID="7bcae573-48fa-4920-8b8f-4df57d4c5375" containerID="7ad4fe9694512426bf33f9e9fffb584b0eb79d993f651695d89fa084a7ae5d2c" exitCode=0 Oct 01 13:25:31 crc kubenswrapper[4749]: I1001 13:25:31.566660 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d55n6" event={"ID":"7bcae573-48fa-4920-8b8f-4df57d4c5375","Type":"ContainerDied","Data":"7ad4fe9694512426bf33f9e9fffb584b0eb79d993f651695d89fa084a7ae5d2c"} Oct 01 13:25:31 crc kubenswrapper[4749]: I1001 13:25:31.572860 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 01 13:25:32 crc kubenswrapper[4749]: I1001 13:25:32.579522 4749 generic.go:334] "Generic (PLEG): container finished" podID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerID="a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7" exitCode=1 Oct 01 13:25:32 crc kubenswrapper[4749]: I1001 13:25:32.579717 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fab3f7e4-b73c-4e23-839f-719e5b8ca205","Type":"ContainerDied","Data":"a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7"} Oct 01 13:25:32 crc kubenswrapper[4749]: I1001 13:25:32.579752 4749 scope.go:117] "RemoveContainer" containerID="7eb16ed1d2c410890330eb87312d4001c4e15355f7efe7f1e591cd47470841b6" Oct 01 13:25:32 crc kubenswrapper[4749]: I1001 13:25:32.580661 4749 scope.go:117] "RemoveContainer" containerID="a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7" Oct 01 13:25:32 crc kubenswrapper[4749]: E1001 13:25:32.580870 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(fab3f7e4-b73c-4e23-839f-719e5b8ca205)\"" pod="openstack/watcher-decision-engine-0" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" Oct 01 13:25:32 crc kubenswrapper[4749]: E1001 13:25:32.667304 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="527dd8d48c1e873adbfbaf3ed96221450695094899745892bd5f87a9ba412828" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 01 13:25:32 crc kubenswrapper[4749]: E1001 13:25:32.672430 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="527dd8d48c1e873adbfbaf3ed96221450695094899745892bd5f87a9ba412828" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 01 13:25:32 crc kubenswrapper[4749]: E1001 13:25:32.677419 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="527dd8d48c1e873adbfbaf3ed96221450695094899745892bd5f87a9ba412828" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 01 13:25:32 crc kubenswrapper[4749]: E1001 13:25:32.677501 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="dd95f834-908d-4a7c-8552-b93425ae5dd8" containerName="watcher-applier" Oct 01 13:25:33 crc kubenswrapper[4749]: I1001 13:25:33.602912 4749 generic.go:334] "Generic (PLEG): container finished" podID="dd95f834-908d-4a7c-8552-b93425ae5dd8" 
containerID="527dd8d48c1e873adbfbaf3ed96221450695094899745892bd5f87a9ba412828" exitCode=0 Oct 01 13:25:33 crc kubenswrapper[4749]: I1001 13:25:33.603099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"dd95f834-908d-4a7c-8552-b93425ae5dd8","Type":"ContainerDied","Data":"527dd8d48c1e873adbfbaf3ed96221450695094899745892bd5f87a9ba412828"} Oct 01 13:25:34 crc kubenswrapper[4749]: I1001 13:25:34.235358 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:34 crc kubenswrapper[4749]: I1001 13:25:34.235781 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api-log" containerID="cri-o://91f8c3ba19c8a5c8bb236248a8de1ad52b586be4364bd154c8dc4552652bb22c" gracePeriod=30 Oct 01 13:25:34 crc kubenswrapper[4749]: I1001 13:25:34.235846 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api" containerID="cri-o://df69449cb75afde5077944a7c210733d4e770a4ec99aa4e72823174395f223c2" gracePeriod=30 Oct 01 13:25:34 crc kubenswrapper[4749]: I1001 13:25:34.613229 4749 generic.go:334] "Generic (PLEG): container finished" podID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerID="91f8c3ba19c8a5c8bb236248a8de1ad52b586be4364bd154c8dc4552652bb22c" exitCode=143 Oct 01 13:25:34 crc kubenswrapper[4749]: I1001 13:25:34.613270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a825f7ae-11ba-4e63-8996-12e4b7b6216b","Type":"ContainerDied","Data":"91f8c3ba19c8a5c8bb236248a8de1ad52b586be4364bd154c8dc4552652bb22c"} Oct 01 13:25:35 crc kubenswrapper[4749]: I1001 13:25:35.629166 4749 generic.go:334] "Generic (PLEG): container finished" podID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" 
containerID="df69449cb75afde5077944a7c210733d4e770a4ec99aa4e72823174395f223c2" exitCode=0 Oct 01 13:25:35 crc kubenswrapper[4749]: I1001 13:25:35.629265 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a825f7ae-11ba-4e63-8996-12e4b7b6216b","Type":"ContainerDied","Data":"df69449cb75afde5077944a7c210733d4e770a4ec99aa4e72823174395f223c2"} Oct 01 13:25:35 crc kubenswrapper[4749]: I1001 13:25:35.675572 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Oct 01 13:25:35 crc kubenswrapper[4749]: I1001 13:25:35.675586 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.162:9322/\": dial tcp 10.217.0.162:9322: connect: connection refused" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.524284 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.560771 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-d55n6" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.634820 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ct5n\" (UniqueName: \"kubernetes.io/projected/dd95f834-908d-4a7c-8552-b93425ae5dd8-kube-api-access-8ct5n\") pod \"dd95f834-908d-4a7c-8552-b93425ae5dd8\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.634952 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-combined-ca-bundle\") pod \"7bcae573-48fa-4920-8b8f-4df57d4c5375\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.634998 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-config-data\") pod \"7bcae573-48fa-4920-8b8f-4df57d4c5375\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.635067 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8l5g\" (UniqueName: \"kubernetes.io/projected/7bcae573-48fa-4920-8b8f-4df57d4c5375-kube-api-access-p8l5g\") pod \"7bcae573-48fa-4920-8b8f-4df57d4c5375\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.635097 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-combined-ca-bundle\") pod \"dd95f834-908d-4a7c-8552-b93425ae5dd8\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.635125 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-db-sync-config-data\") pod \"7bcae573-48fa-4920-8b8f-4df57d4c5375\" (UID: \"7bcae573-48fa-4920-8b8f-4df57d4c5375\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.635180 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-config-data\") pod \"dd95f834-908d-4a7c-8552-b93425ae5dd8\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.635236 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd95f834-908d-4a7c-8552-b93425ae5dd8-logs\") pod \"dd95f834-908d-4a7c-8552-b93425ae5dd8\" (UID: \"dd95f834-908d-4a7c-8552-b93425ae5dd8\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.636046 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd95f834-908d-4a7c-8552-b93425ae5dd8-logs" (OuterVolumeSpecName: "logs") pod "dd95f834-908d-4a7c-8552-b93425ae5dd8" (UID: "dd95f834-908d-4a7c-8552-b93425ae5dd8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.643366 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bcae573-48fa-4920-8b8f-4df57d4c5375-kube-api-access-p8l5g" (OuterVolumeSpecName: "kube-api-access-p8l5g") pod "7bcae573-48fa-4920-8b8f-4df57d4c5375" (UID: "7bcae573-48fa-4920-8b8f-4df57d4c5375"). InnerVolumeSpecName "kube-api-access-p8l5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.652493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7bcae573-48fa-4920-8b8f-4df57d4c5375" (UID: "7bcae573-48fa-4920-8b8f-4df57d4c5375"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.664505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd95f834-908d-4a7c-8552-b93425ae5dd8-kube-api-access-8ct5n" (OuterVolumeSpecName: "kube-api-access-8ct5n") pod "dd95f834-908d-4a7c-8552-b93425ae5dd8" (UID: "dd95f834-908d-4a7c-8552-b93425ae5dd8"). InnerVolumeSpecName "kube-api-access-8ct5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.664632 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7bcae573-48fa-4920-8b8f-4df57d4c5375" (UID: "7bcae573-48fa-4920-8b8f-4df57d4c5375"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.672775 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.672775 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"dd95f834-908d-4a7c-8552-b93425ae5dd8","Type":"ContainerDied","Data":"b08840ab2a55da2ea2039f3b9f1460d477c0bfc283b0afc1e67d527c46c1dce1"} Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.672850 4749 scope.go:117] "RemoveContainer" containerID="527dd8d48c1e873adbfbaf3ed96221450695094899745892bd5f87a9ba412828" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.675197 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d55n6" event={"ID":"7bcae573-48fa-4920-8b8f-4df57d4c5375","Type":"ContainerDied","Data":"41f2867c999513ee36702ba407a9edb9d82cd0bda93f1d3e9ef23ab73ad63b97"} Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.675248 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41f2867c999513ee36702ba407a9edb9d82cd0bda93f1d3e9ef23ab73ad63b97" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.675302 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-d55n6" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.739647 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd95f834-908d-4a7c-8552-b93425ae5dd8-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.739683 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ct5n\" (UniqueName: \"kubernetes.io/projected/dd95f834-908d-4a7c-8552-b93425ae5dd8-kube-api-access-8ct5n\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.739693 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.739701 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8l5g\" (UniqueName: \"kubernetes.io/projected/7bcae573-48fa-4920-8b8f-4df57d4c5375-kube-api-access-p8l5g\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.739709 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.760094 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.801859 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd95f834-908d-4a7c-8552-b93425ae5dd8" (UID: "dd95f834-908d-4a7c-8552-b93425ae5dd8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.841763 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-custom-prometheus-ca\") pod \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.841838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-combined-ca-bundle\") pod \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.841977 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5h7d\" (UniqueName: \"kubernetes.io/projected/a825f7ae-11ba-4e63-8996-12e4b7b6216b-kube-api-access-d5h7d\") pod \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.842012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a825f7ae-11ba-4e63-8996-12e4b7b6216b-logs\") pod \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.842048 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-config-data\") pod \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\" (UID: \"a825f7ae-11ba-4e63-8996-12e4b7b6216b\") " Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.871384 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.879357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a825f7ae-11ba-4e63-8996-12e4b7b6216b-logs" (OuterVolumeSpecName: "logs") pod "a825f7ae-11ba-4e63-8996-12e4b7b6216b" (UID: "a825f7ae-11ba-4e63-8996-12e4b7b6216b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.914528 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a825f7ae-11ba-4e63-8996-12e4b7b6216b-kube-api-access-d5h7d" (OuterVolumeSpecName: "kube-api-access-d5h7d") pod "a825f7ae-11ba-4e63-8996-12e4b7b6216b" (UID: "a825f7ae-11ba-4e63-8996-12e4b7b6216b"). InnerVolumeSpecName "kube-api-access-d5h7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.918879 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-config-data" (OuterVolumeSpecName: "config-data") pod "7bcae573-48fa-4920-8b8f-4df57d4c5375" (UID: "7bcae573-48fa-4920-8b8f-4df57d4c5375"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.952335 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-config-data" (OuterVolumeSpecName: "config-data") pod "dd95f834-908d-4a7c-8552-b93425ae5dd8" (UID: "dd95f834-908d-4a7c-8552-b93425ae5dd8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.955864 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a825f7ae-11ba-4e63-8996-12e4b7b6216b" (UID: "a825f7ae-11ba-4e63-8996-12e4b7b6216b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.962430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a825f7ae-11ba-4e63-8996-12e4b7b6216b" (UID: "a825f7ae-11ba-4e63-8996-12e4b7b6216b"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.981226 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a825f7ae-11ba-4e63-8996-12e4b7b6216b-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.981694 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bcae573-48fa-4920-8b8f-4df57d4c5375-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.981772 4749 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.981847 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.981923 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd95f834-908d-4a7c-8552-b93425ae5dd8-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:36 crc kubenswrapper[4749]: I1001 13:25:36.981982 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5h7d\" (UniqueName: \"kubernetes.io/projected/a825f7ae-11ba-4e63-8996-12e4b7b6216b-kube-api-access-d5h7d\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.009519 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-config-data" (OuterVolumeSpecName: "config-data") pod "a825f7ae-11ba-4e63-8996-12e4b7b6216b" (UID: "a825f7ae-11ba-4e63-8996-12e4b7b6216b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.073908 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.083970 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a825f7ae-11ba-4e63-8996-12e4b7b6216b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.094280 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.105472 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 01 13:25:37 crc kubenswrapper[4749]: E1001 13:25:37.105821 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api-log" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.105834 4749 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api-log" Oct 01 13:25:37 crc kubenswrapper[4749]: E1001 13:25:37.105854 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bcae573-48fa-4920-8b8f-4df57d4c5375" containerName="glance-db-sync" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.105861 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bcae573-48fa-4920-8b8f-4df57d4c5375" containerName="glance-db-sync" Oct 01 13:25:37 crc kubenswrapper[4749]: E1001 13:25:37.105882 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd95f834-908d-4a7c-8552-b93425ae5dd8" containerName="watcher-applier" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.105888 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd95f834-908d-4a7c-8552-b93425ae5dd8" containerName="watcher-applier" Oct 01 13:25:37 crc kubenswrapper[4749]: E1001 13:25:37.105909 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.105919 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.106083 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bcae573-48fa-4920-8b8f-4df57d4c5375" containerName="glance-db-sync" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.106091 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api-log" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.106106 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd95f834-908d-4a7c-8552-b93425ae5dd8" containerName="watcher-applier" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.106120 4749 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" containerName="watcher-api" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.107054 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.111515 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.120811 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.250317 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd95f834-908d-4a7c-8552-b93425ae5dd8" path="/var/lib/kubelet/pods/dd95f834-908d-4a7c-8552-b93425ae5dd8/volumes" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.287201 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60860f07-ba03-4dfb-bb91-2bd68232bc90-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.287264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60860f07-ba03-4dfb-bb91-2bd68232bc90-logs\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.287316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60860f07-ba03-4dfb-bb91-2bd68232bc90-config-data\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc 
kubenswrapper[4749]: I1001 13:25:37.287339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t56p5\" (UniqueName: \"kubernetes.io/projected/60860f07-ba03-4dfb-bb91-2bd68232bc90-kube-api-access-t56p5\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.389161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60860f07-ba03-4dfb-bb91-2bd68232bc90-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.389204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60860f07-ba03-4dfb-bb91-2bd68232bc90-logs\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.389307 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60860f07-ba03-4dfb-bb91-2bd68232bc90-config-data\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.389325 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t56p5\" (UniqueName: \"kubernetes.io/projected/60860f07-ba03-4dfb-bb91-2bd68232bc90-kube-api-access-t56p5\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.389728 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/60860f07-ba03-4dfb-bb91-2bd68232bc90-logs\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.393726 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60860f07-ba03-4dfb-bb91-2bd68232bc90-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.399780 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60860f07-ba03-4dfb-bb91-2bd68232bc90-config-data\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.404685 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t56p5\" (UniqueName: \"kubernetes.io/projected/60860f07-ba03-4dfb-bb91-2bd68232bc90-kube-api-access-t56p5\") pod \"watcher-applier-0\" (UID: \"60860f07-ba03-4dfb-bb91-2bd68232bc90\") " pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.433196 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.455114 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.455158 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.455797 4749 scope.go:117] "RemoveContainer" containerID="a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7" Oct 01 13:25:37 crc kubenswrapper[4749]: E1001 13:25:37.456015 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(fab3f7e4-b73c-4e23-839f-719e5b8ca205)\"" pod="openstack/watcher-decision-engine-0" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.697077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a825f7ae-11ba-4e63-8996-12e4b7b6216b","Type":"ContainerDied","Data":"e6a0e04386f22018b4e8bfd587272eb4e785b3194d3ebe4f47aa2551d81c90bc"} Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.697404 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.703414 4749 scope.go:117] "RemoveContainer" containerID="df69449cb75afde5077944a7c210733d4e770a4ec99aa4e72823174395f223c2" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.705786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6mlkv" event={"ID":"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78","Type":"ContainerStarted","Data":"13edf912c6e6920b991a543183e9f3e801bbc7c3b97b97a8a5c2bcf49253aeed"} Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.758762 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.766431 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.778765 4749 scope.go:117] "RemoveContainer" containerID="91f8c3ba19c8a5c8bb236248a8de1ad52b586be4364bd154c8dc4552652bb22c" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.779801 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6mlkv" podStartSLOduration=3.025581773 podStartE2EDuration="10.779780317s" podCreationTimestamp="2025-10-01 13:25:27 +0000 UTC" firstStartedPulling="2025-10-01 13:25:28.633304105 +0000 UTC m=+1188.687289004" lastFinishedPulling="2025-10-01 13:25:36.387502649 +0000 UTC m=+1196.441487548" observedRunningTime="2025-10-01 13:25:37.775965014 +0000 UTC m=+1197.829949913" watchObservedRunningTime="2025-10-01 13:25:37.779780317 +0000 UTC m=+1197.833765226" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.795126 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.797227 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.799696 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.800004 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.800157 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.823884 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.896683 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f5c595757-9xvdk"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.898173 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.907733 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f5c595757-9xvdk"] Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.908010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-public-tls-certs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.908063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696adfa9-0326-4d60-8a2d-c53ee267a249-logs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: 
I1001 13:25:37.908234 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-config-data\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.908287 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.908386 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwwlb\" (UniqueName: \"kubernetes.io/projected/696adfa9-0326-4d60-8a2d-c53ee267a249-kube-api-access-rwwlb\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.908413 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.909779 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:37 crc kubenswrapper[4749]: I1001 13:25:37.974326 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/watcher-applier-0"] Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012196 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-config-data\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012257 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012284 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012468 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwwlb\" (UniqueName: \"kubernetes.io/projected/696adfa9-0326-4d60-8a2d-c53ee267a249-kube-api-access-rwwlb\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012536 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-swift-storage-0\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012580 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-svc\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012606 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012629 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqkjz\" (UniqueName: \"kubernetes.io/projected/603bb41c-f663-4085-9036-ab29af4efa15-kube-api-access-kqkjz\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012670 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-config\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012692 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012727 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-public-tls-certs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.012743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696adfa9-0326-4d60-8a2d-c53ee267a249-logs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.015030 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696adfa9-0326-4d60-8a2d-c53ee267a249-logs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.017887 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.019350 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: 
\"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.020462 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-public-tls-certs\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.021318 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-config-data\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.021661 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696adfa9-0326-4d60-8a2d-c53ee267a249-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.036309 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwwlb\" (UniqueName: \"kubernetes.io/projected/696adfa9-0326-4d60-8a2d-c53ee267a249-kube-api-access-rwwlb\") pod \"watcher-api-0\" (UID: \"696adfa9-0326-4d60-8a2d-c53ee267a249\") " pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.116274 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.116340 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-swift-storage-0\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.116413 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-svc\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.116470 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqkjz\" (UniqueName: \"kubernetes.io/projected/603bb41c-f663-4085-9036-ab29af4efa15-kube-api-access-kqkjz\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.116539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-config\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.116561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.117286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-sb\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.119034 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-config\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.119056 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-nb\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.119347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-svc\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.121769 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-swift-storage-0\") pod \"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.134941 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqkjz\" (UniqueName: \"kubernetes.io/projected/603bb41c-f663-4085-9036-ab29af4efa15-kube-api-access-kqkjz\") pod 
\"dnsmasq-dns-6f5c595757-9xvdk\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.142457 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.219090 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.619965 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.759785 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"60860f07-ba03-4dfb-bb91-2bd68232bc90","Type":"ContainerStarted","Data":"753ff9a5e2e4e052fee2fa5446367e8b84d5fc32a6b60d5799f0b7328e998a43"} Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.760088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"60860f07-ba03-4dfb-bb91-2bd68232bc90","Type":"ContainerStarted","Data":"31b43cb412910265dd201e0b2fca8429b86bc6b698c7fa6cf09b3540087f54b5"} Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.808189 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=1.8081700440000001 podStartE2EDuration="1.808170044s" podCreationTimestamp="2025-10-01 13:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:25:38.77395571 +0000 UTC m=+1198.827940619" watchObservedRunningTime="2025-10-01 13:25:38.808170044 +0000 UTC m=+1198.862154943" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.809247 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:25:38 crc 
kubenswrapper[4749]: I1001 13:25:38.810925 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.814588 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-mvrfw" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.815352 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.816334 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.840718 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.874078 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.937895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.937956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-logs\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.938044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjbxh\" 
(UniqueName: \"kubernetes.io/projected/6f38ffc0-1f36-4c0d-9829-770095534d91-kube-api-access-tjbxh\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.938084 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.938123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.938264 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:38 crc kubenswrapper[4749]: I1001 13:25:38.938520 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.020063 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] 
Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.021738 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.026529 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.044159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.044245 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.044355 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.044434 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.044452 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-logs\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.044481 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjbxh\" (UniqueName: \"kubernetes.io/projected/6f38ffc0-1f36-4c0d-9829-770095534d91-kube-api-access-tjbxh\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.044508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.049438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.050256 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-logs\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.050467 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.050637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-config-data\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.055032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.061922 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.070107 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjbxh\" (UniqueName: \"kubernetes.io/projected/6f38ffc0-1f36-4c0d-9829-770095534d91-kube-api-access-tjbxh\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.071210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-scripts\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.103385 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.130496 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.146853 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.147517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5k7k\" (UniqueName: \"kubernetes.io/projected/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-kube-api-access-h5k7k\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.147593 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.147648 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.147683 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.147718 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.147734 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.147807 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.246109 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a825f7ae-11ba-4e63-8996-12e4b7b6216b" path="/var/lib/kubelet/pods/a825f7ae-11ba-4e63-8996-12e4b7b6216b/volumes" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.249783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.249851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.249888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.249907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.250007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.250048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5k7k\" (UniqueName: \"kubernetes.io/projected/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-kube-api-access-h5k7k\") pod \"glance-default-internal-api-0\" (UID: 
\"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.250090 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.251126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-logs\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.252320 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.254565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.255257 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: 
I1001 13:25:39.257505 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.258183 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.280142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5k7k\" (UniqueName: \"kubernetes.io/projected/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-kube-api-access-h5k7k\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.303894 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:25:39 crc kubenswrapper[4749]: I1001 13:25:39.480800 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:25:40 crc kubenswrapper[4749]: I1001 13:25:40.970450 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:25:41 crc kubenswrapper[4749]: I1001 13:25:41.039800 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:25:41 crc kubenswrapper[4749]: I1001 13:25:41.266332 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b74b5b846-r84t7" Oct 01 13:25:41 crc kubenswrapper[4749]: I1001 13:25:41.328606 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:25:41 crc kubenswrapper[4749]: I1001 13:25:41.361943 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84f6c74b66-trbb9"] Oct 01 13:25:41 crc kubenswrapper[4749]: I1001 13:25:41.789304 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84f6c74b66-trbb9" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon-log" containerID="cri-o://070469cc902a7a3114f0e257a6257c4a6d561f9bd6e5bdf6b9ee86f6304d2e5f" gracePeriod=30 Oct 01 13:25:41 crc kubenswrapper[4749]: I1001 13:25:41.789528 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84f6c74b66-trbb9" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon" containerID="cri-o://a94805c30f134bb40119ae76ad5949c8770747149e8665ebf0949b027b1ecb2a" gracePeriod=30 Oct 01 13:25:42 crc kubenswrapper[4749]: I1001 13:25:42.434519 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 01 13:25:42 crc kubenswrapper[4749]: I1001 13:25:42.800708 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" 
containerID="a94805c30f134bb40119ae76ad5949c8770747149e8665ebf0949b027b1ecb2a" exitCode=0 Oct 01 13:25:42 crc kubenswrapper[4749]: I1001 13:25:42.800771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84f6c74b66-trbb9" event={"ID":"7ae81866-7b3d-44f4-a4fd-5b49e2223352","Type":"ContainerDied","Data":"a94805c30f134bb40119ae76ad5949c8770747149e8665ebf0949b027b1ecb2a"} Oct 01 13:25:42 crc kubenswrapper[4749]: I1001 13:25:42.802116 4749 generic.go:334] "Generic (PLEG): container finished" podID="e3b316dc-4b38-4e40-bc7f-e8d64a9caa78" containerID="13edf912c6e6920b991a543183e9f3e801bbc7c3b97b97a8a5c2bcf49253aeed" exitCode=0 Oct 01 13:25:42 crc kubenswrapper[4749]: I1001 13:25:42.802142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6mlkv" event={"ID":"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78","Type":"ContainerDied","Data":"13edf912c6e6920b991a543183e9f3e801bbc7c3b97b97a8a5c2bcf49253aeed"} Oct 01 13:25:44 crc kubenswrapper[4749]: I1001 13:25:44.824259 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f535ba4-1d6d-4103-8764-c324341bffdd" containerID="1e2f818f18cc349d92509baf7334070dc169aafeec21bb02c2074589bdda24e3" exitCode=0 Oct 01 13:25:44 crc kubenswrapper[4749]: I1001 13:25:44.824311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h459j" event={"ID":"4f535ba4-1d6d-4103-8764-c324341bffdd","Type":"ContainerDied","Data":"1e2f818f18cc349d92509baf7334070dc169aafeec21bb02c2074589bdda24e3"} Oct 01 13:25:46 crc kubenswrapper[4749]: I1001 13:25:46.698131 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84f6c74b66-trbb9" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Oct 01 13:25:47 crc kubenswrapper[4749]: I1001 13:25:47.434653 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 01 13:25:47 crc kubenswrapper[4749]: I1001 13:25:47.477595 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 01 13:25:47 crc kubenswrapper[4749]: I1001 13:25:47.910038 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.000775 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h459j" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.126787 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-config\") pod \"4f535ba4-1d6d-4103-8764-c324341bffdd\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.127004 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-combined-ca-bundle\") pod \"4f535ba4-1d6d-4103-8764-c324341bffdd\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.127024 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzmd7\" (UniqueName: \"kubernetes.io/projected/4f535ba4-1d6d-4103-8764-c324341bffdd-kube-api-access-rzmd7\") pod \"4f535ba4-1d6d-4103-8764-c324341bffdd\" (UID: \"4f535ba4-1d6d-4103-8764-c324341bffdd\") " Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.144769 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f535ba4-1d6d-4103-8764-c324341bffdd-kube-api-access-rzmd7" (OuterVolumeSpecName: "kube-api-access-rzmd7") pod "4f535ba4-1d6d-4103-8764-c324341bffdd" (UID: 
"4f535ba4-1d6d-4103-8764-c324341bffdd"). InnerVolumeSpecName "kube-api-access-rzmd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.154905 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f535ba4-1d6d-4103-8764-c324341bffdd" (UID: "4f535ba4-1d6d-4103-8764-c324341bffdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.157419 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-config" (OuterVolumeSpecName: "config") pod "4f535ba4-1d6d-4103-8764-c324341bffdd" (UID: "4f535ba4-1d6d-4103-8764-c324341bffdd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.230525 4749 scope.go:117] "RemoveContainer" containerID="a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.231379 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.231412 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f535ba4-1d6d-4103-8764-c324341bffdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.231430 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzmd7\" (UniqueName: \"kubernetes.io/projected/4f535ba4-1d6d-4103-8764-c324341bffdd-kube-api-access-rzmd7\") on node \"crc\" DevicePath \"\"" Oct 01 
13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.391438 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f5c595757-9xvdk"] Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.879822 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h459j" event={"ID":"4f535ba4-1d6d-4103-8764-c324341bffdd","Type":"ContainerDied","Data":"f64a59da9c721ad80826aa0a0c10493bbc90b3d1dbb66679ea531bc5cbcb82a4"} Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.879879 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f64a59da9c721ad80826aa0a0c10493bbc90b3d1dbb66679ea531bc5cbcb82a4" Oct 01 13:25:48 crc kubenswrapper[4749]: I1001 13:25:48.879845 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h459j" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.144095 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f5c595757-9xvdk"] Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.169485 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6955885c6f-b6gb9"] Oct 01 13:25:49 crc kubenswrapper[4749]: E1001 13:25:49.169867 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f535ba4-1d6d-4103-8764-c324341bffdd" containerName="neutron-db-sync" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.169878 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f535ba4-1d6d-4103-8764-c324341bffdd" containerName="neutron-db-sync" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.170046 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f535ba4-1d6d-4103-8764-c324341bffdd" containerName="neutron-db-sync" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.171064 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.202553 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6955885c6f-b6gb9"] Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.250012 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.250063 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-svc\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.250088 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.250281 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.250492 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r8n6\" (UniqueName: \"kubernetes.io/projected/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-kube-api-access-7r8n6\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.250598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-config\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.252123 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-58f698d9cb-ntlhp"] Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.255857 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.256010 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58f698d9cb-ntlhp"] Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.260425 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.260582 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.260596 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.260721 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pxffj" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352132 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-httpd-config\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-config\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352225 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x542\" (UniqueName: \"kubernetes.io/projected/60010c1d-09a2-4f43-9f80-c896d4d02945-kube-api-access-6x542\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352269 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r8n6\" (UniqueName: \"kubernetes.io/projected/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-kube-api-access-7r8n6\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352463 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-combined-ca-bundle\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352503 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-config\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352456 4749 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod57aa7c22-d447-48f7-b16b-517c8553dc09"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod57aa7c22-d447-48f7-b16b-517c8553dc09] : Timed out while waiting for systemd to remove kubepods-besteffort-pod57aa7c22_d447_48f7_b16b_517c8553dc09.slice" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352635 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352677 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-svc\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352729 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-ovndb-tls-certs\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.352817 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.353776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-nb\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.353891 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-swift-storage-0\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.354142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-sb\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.354471 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-svc\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.355701 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-config\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.369520 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r8n6\" (UniqueName: \"kubernetes.io/projected/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-kube-api-access-7r8n6\") pod \"dnsmasq-dns-6955885c6f-b6gb9\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.454859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-combined-ca-bundle\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.454932 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-ovndb-tls-certs\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.454994 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-httpd-config\") pod 
\"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.455024 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-config\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.455048 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x542\" (UniqueName: \"kubernetes.io/projected/60010c1d-09a2-4f43-9f80-c896d4d02945-kube-api-access-6x542\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.460562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-ovndb-tls-certs\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.460695 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-httpd-config\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.464614 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-combined-ca-bundle\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 
13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.474731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x542\" (UniqueName: \"kubernetes.io/projected/60010c1d-09a2-4f43-9f80-c896d4d02945-kube-api-access-6x542\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.488210 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-config\") pod \"neutron-58f698d9cb-ntlhp\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.493496 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:25:49 crc kubenswrapper[4749]: I1001 13:25:49.590971 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.423727 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-59d7c7544c-n7mlp"] Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.425727 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.429016 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.429205 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.443846 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59d7c7544c-n7mlp"] Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.492236 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-internal-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.492305 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-combined-ca-bundle\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.492345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-config\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.492369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qt2b\" (UniqueName: 
\"kubernetes.io/projected/680ec9d6-ccd3-4417-9919-7412600f23fb-kube-api-access-4qt2b\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.492438 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-ovndb-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.492475 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-public-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.492654 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-httpd-config\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.594605 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-ovndb-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.594684 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-public-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.594738 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-httpd-config\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.594770 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-internal-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.594814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-combined-ca-bundle\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.594844 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-config\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.594866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qt2b\" (UniqueName: \"kubernetes.io/projected/680ec9d6-ccd3-4417-9919-7412600f23fb-kube-api-access-4qt2b\") pod 
\"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.603082 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-ovndb-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.603858 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-httpd-config\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.604962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-public-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.605504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-combined-ca-bundle\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.606879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-internal-tls-certs\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 
13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.610923 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/680ec9d6-ccd3-4417-9919-7412600f23fb-config\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.620873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qt2b\" (UniqueName: \"kubernetes.io/projected/680ec9d6-ccd3-4417-9919-7412600f23fb-kube-api-access-4qt2b\") pod \"neutron-59d7c7544c-n7mlp\" (UID: \"680ec9d6-ccd3-4417-9919-7412600f23fb\") " pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:51 crc kubenswrapper[4749]: I1001 13:25:51.811952 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.483074 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-6mlkv" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.612859 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-combined-ca-bundle\") pod \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.613437 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-db-sync-config-data\") pod \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.613667 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jh85\" (UniqueName: \"kubernetes.io/projected/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-kube-api-access-8jh85\") pod \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\" (UID: \"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78\") " Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.630783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-kube-api-access-8jh85" (OuterVolumeSpecName: "kube-api-access-8jh85") pod "e3b316dc-4b38-4e40-bc7f-e8d64a9caa78" (UID: "e3b316dc-4b38-4e40-bc7f-e8d64a9caa78"). InnerVolumeSpecName "kube-api-access-8jh85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.632405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e3b316dc-4b38-4e40-bc7f-e8d64a9caa78" (UID: "e3b316dc-4b38-4e40-bc7f-e8d64a9caa78"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.646680 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3b316dc-4b38-4e40-bc7f-e8d64a9caa78" (UID: "e3b316dc-4b38-4e40-bc7f-e8d64a9caa78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.715876 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.715909 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.715920 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jh85\" (UniqueName: \"kubernetes.io/projected/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78-kube-api-access-8jh85\") on node \"crc\" DevicePath \"\"" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.948280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"696adfa9-0326-4d60-8a2d-c53ee267a249","Type":"ContainerStarted","Data":"e05c33a99bd943eeb67b27ebbad3426657f7ffbc3234cfc679671cf301d45db1"} Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.951257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6mlkv" event={"ID":"e3b316dc-4b38-4e40-bc7f-e8d64a9caa78","Type":"ContainerDied","Data":"715cea8c50c24d08f0152c4b232e3780d93309ce310c0ace6923f285c76df6cf"} Oct 01 13:25:52 crc 
kubenswrapper[4749]: I1001 13:25:52.951321 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="715cea8c50c24d08f0152c4b232e3780d93309ce310c0ace6923f285c76df6cf" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.951282 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6mlkv" Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.953886 4749 generic.go:334] "Generic (PLEG): container finished" podID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerID="66530ca5a8ae469488d794e90dd097c3615d835cd05524f07db2510f26e1b8e0" exitCode=137 Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.953920 4749 generic.go:334] "Generic (PLEG): container finished" podID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerID="5c6a4dca9736198a826b2581e2b71aac1596e33d3187fafb784fdac24534fce0" exitCode=137 Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.953943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d96cdbf7c-fpnt7" event={"ID":"00959eed-1bce-4b1a-9978-e978b9bdb4cf","Type":"ContainerDied","Data":"66530ca5a8ae469488d794e90dd097c3615d835cd05524f07db2510f26e1b8e0"} Oct 01 13:25:52 crc kubenswrapper[4749]: I1001 13:25:52.953975 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d96cdbf7c-fpnt7" event={"ID":"00959eed-1bce-4b1a-9978-e978b9bdb4cf","Type":"ContainerDied","Data":"5c6a4dca9736198a826b2581e2b71aac1596e33d3187fafb784fdac24534fce0"} Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.649774 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5c6dfb5585-x78z5"] Oct 01 13:25:54 crc kubenswrapper[4749]: E1001 13:25:54.652830 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b316dc-4b38-4e40-bc7f-e8d64a9caa78" containerName="barbican-db-sync" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.654287 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e3b316dc-4b38-4e40-bc7f-e8d64a9caa78" containerName="barbican-db-sync" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.655149 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b316dc-4b38-4e40-bc7f-e8d64a9caa78" containerName="barbican-db-sync" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.664176 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.668293 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-5spks" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.668511 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.668784 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.668882 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5fdf7b5778-vxx8p"] Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.673042 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.679392 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.688670 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c6dfb5585-x78z5"] Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.732316 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5fdf7b5778-vxx8p"] Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.783878 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6955885c6f-b6gb9"] Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.807837 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68c474b9fc-mn4ts"] Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.809788 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.826036 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68c474b9fc-mn4ts"] Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.858594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-config-data-custom\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.858903 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27e333f-57a3-4257-9e49-e03928cfa02d-logs\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.858927 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gsvt\" (UniqueName: \"kubernetes.io/projected/a27e333f-57a3-4257-9e49-e03928cfa02d-kube-api-access-4gsvt\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.858995 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f66wj\" (UniqueName: \"kubernetes.io/projected/c8b1d3a9-044c-475f-b86f-7e099e2b1197-kube-api-access-f66wj\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: 
I1001 13:25:54.859020 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-combined-ca-bundle\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.859060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-config-data-custom\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.859093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8b1d3a9-044c-475f-b86f-7e099e2b1197-logs\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.859113 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-combined-ca-bundle\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.859133 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-config-data\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: 
\"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.859154 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-config-data\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960571 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8b1d3a9-044c-475f-b86f-7e099e2b1197-logs\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960624 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-combined-ca-bundle\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-config-data\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960672 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-config-data\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: 
\"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960706 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-config-data-custom\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960732 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-config\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960755 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27e333f-57a3-4257-9e49-e03928cfa02d-logs\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960772 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gsvt\" (UniqueName: \"kubernetes.io/projected/a27e333f-57a3-4257-9e49-e03928cfa02d-kube-api-access-4gsvt\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-sb\") pod 
\"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-svc\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960859 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-nb\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960889 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f66wj\" (UniqueName: \"kubernetes.io/projected/c8b1d3a9-044c-475f-b86f-7e099e2b1197-kube-api-access-f66wj\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-swift-storage-0\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-combined-ca-bundle\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.960964 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmq7\" (UniqueName: \"kubernetes.io/projected/b34ccf83-0af2-438a-ad7f-c79c6886db75-kube-api-access-hbmq7\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.961007 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-config-data-custom\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.963058 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a27e333f-57a3-4257-9e49-e03928cfa02d-logs\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.963969 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8b1d3a9-044c-475f-b86f-7e099e2b1197-logs\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.975022 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-config-data\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.976980 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-config-data-custom\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.982409 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-combined-ca-bundle\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.982484 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27e333f-57a3-4257-9e49-e03928cfa02d-config-data\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.982622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-config-data-custom\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.991128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-f66wj\" (UniqueName: \"kubernetes.io/projected/c8b1d3a9-044c-475f-b86f-7e099e2b1197-kube-api-access-f66wj\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:54 crc kubenswrapper[4749]: I1001 13:25:54.991905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gsvt\" (UniqueName: \"kubernetes.io/projected/a27e333f-57a3-4257-9e49-e03928cfa02d-kube-api-access-4gsvt\") pod \"barbican-keystone-listener-5fdf7b5778-vxx8p\" (UID: \"a27e333f-57a3-4257-9e49-e03928cfa02d\") " pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.002063 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.002434 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b1d3a9-044c-475f-b86f-7e099e2b1197-combined-ca-bundle\") pod \"barbican-worker-5c6dfb5585-x78z5\" (UID: \"c8b1d3a9-044c-475f-b86f-7e099e2b1197\") " pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.041887 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c48c76d4d-blnpd"] Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.044029 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.046741 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.065866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmq7\" (UniqueName: \"kubernetes.io/projected/b34ccf83-0af2-438a-ad7f-c79c6886db75-kube-api-access-hbmq7\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.065967 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d658fa13-ec8c-4936-9015-531912d2e050-logs\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.065998 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-combined-ca-bundle\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.066019 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.066046 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-config\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.066084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-sb\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.066102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-svc\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.066131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjwxh\" (UniqueName: \"kubernetes.io/projected/d658fa13-ec8c-4936-9015-531912d2e050-kube-api-access-tjwxh\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.066165 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-nb\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.066201 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-swift-storage-0\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.066238 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data-custom\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.067305 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-sb\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.067333 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-config\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.067627 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-svc\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.067837 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-nb\") pod 
\"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.069962 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-swift-storage-0\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.070881 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c48c76d4d-blnpd"] Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.097404 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbmq7\" (UniqueName: \"kubernetes.io/projected/b34ccf83-0af2-438a-ad7f-c79c6886db75-kube-api-access-hbmq7\") pod \"dnsmasq-dns-68c474b9fc-mn4ts\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.131351 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.168259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjwxh\" (UniqueName: \"kubernetes.io/projected/d658fa13-ec8c-4936-9015-531912d2e050-kube-api-access-tjwxh\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.168364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data-custom\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.168477 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d658fa13-ec8c-4936-9015-531912d2e050-logs\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.168521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-combined-ca-bundle\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.168552 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " 
pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.171295 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d658fa13-ec8c-4936-9015-531912d2e050-logs\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.178160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data-custom\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.178691 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-combined-ca-bundle\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.179770 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.192717 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjwxh\" (UniqueName: \"kubernetes.io/projected/d658fa13-ec8c-4936-9015-531912d2e050-kube-api-access-tjwxh\") pod \"barbican-api-5c48c76d4d-blnpd\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") " pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:55 crc kubenswrapper[4749]: 
I1001 13:25:55.290802 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5c6dfb5585-x78z5" Oct 01 13:25:55 crc kubenswrapper[4749]: I1001 13:25:55.452905 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:25:56 crc kubenswrapper[4749]: I1001 13:25:56.698292 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84f6c74b66-trbb9" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.455086 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.455160 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.577310 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-677dbd476-b92fx"] Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.581847 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.588560 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.588719 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.597160 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-677dbd476-b92fx"] Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.721285 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-public-tls-certs\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.721569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-config-data\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.721590 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-combined-ca-bundle\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.721617 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0e51f972-9f26-4b6b-8213-9261797a1ee0-logs\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.721649 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-internal-tls-certs\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.721686 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-config-data-custom\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.721706 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vg6w\" (UniqueName: \"kubernetes.io/projected/0e51f972-9f26-4b6b-8213-9261797a1ee0-kube-api-access-6vg6w\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.823414 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-public-tls-certs\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.823476 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-config-data\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.823497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-combined-ca-bundle\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.823521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e51f972-9f26-4b6b-8213-9261797a1ee0-logs\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.823555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-internal-tls-certs\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.823594 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-config-data-custom\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.823612 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vg6w\" (UniqueName: 
\"kubernetes.io/projected/0e51f972-9f26-4b6b-8213-9261797a1ee0-kube-api-access-6vg6w\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.824938 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e51f972-9f26-4b6b-8213-9261797a1ee0-logs\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.830590 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-config-data\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.830844 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-config-data-custom\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.831919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-combined-ca-bundle\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.837761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-public-tls-certs\") pod 
\"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.839709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e51f972-9f26-4b6b-8213-9261797a1ee0-internal-tls-certs\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.845823 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vg6w\" (UniqueName: \"kubernetes.io/projected/0e51f972-9f26-4b6b-8213-9261797a1ee0-kube-api-access-6vg6w\") pod \"barbican-api-677dbd476-b92fx\" (UID: \"0e51f972-9f26-4b6b-8213-9261797a1ee0\") " pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:57 crc kubenswrapper[4749]: I1001 13:25:57.902919 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:25:58 crc kubenswrapper[4749]: W1001 13:25:58.099463 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod603bb41c_f663_4085_9036_ab29af4efa15.slice/crio-ff1d77ca9d8a313fddd862a53bc691959c60bdce49a5346af1bdb675787ba161 WatchSource:0}: Error finding container ff1d77ca9d8a313fddd862a53bc691959c60bdce49a5346af1bdb675787ba161: Status 404 returned error can't find the container with id ff1d77ca9d8a313fddd862a53bc691959c60bdce49a5346af1bdb675787ba161 Oct 01 13:25:59 crc kubenswrapper[4749]: I1001 13:25:59.035898 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" event={"ID":"603bb41c-f663-4085-9036-ab29af4efa15","Type":"ContainerStarted","Data":"ff1d77ca9d8a313fddd862a53bc691959c60bdce49a5346af1bdb675787ba161"} Oct 01 13:26:01 crc kubenswrapper[4749]: E1001 13:26:01.751123 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 01 13:26:01 crc kubenswrapper[4749]: E1001 13:26:01.751708 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 01 13:26:01 crc kubenswrapper[4749]: E1001 13:26:01.751846 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.30:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbx2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-tc29m_openstack(d9ead71f-c58e-4634-96f9-81c9b165e24c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:26:01 crc kubenswrapper[4749]: E1001 13:26:01.752987 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-tc29m" podUID="d9ead71f-c58e-4634-96f9-81c9b165e24c" Oct 01 13:26:02 crc kubenswrapper[4749]: E1001 13:26:02.069648 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-tc29m" podUID="d9ead71f-c58e-4634-96f9-81c9b165e24c" Oct 01 13:26:02 crc kubenswrapper[4749]: I1001 13:26:02.146705 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-658b64fcb-k2w2c" Oct 01 13:26:03 crc kubenswrapper[4749]: I1001 13:26:03.099099 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:26:03 crc kubenswrapper[4749]: I1001 13:26:03.131785 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-ffbb6dc5b-8kwbn" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.558398 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.666449 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-scripts\") pod \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.666961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00959eed-1bce-4b1a-9978-e978b9bdb4cf-horizon-secret-key\") pod \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.667189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w674x\" (UniqueName: \"kubernetes.io/projected/00959eed-1bce-4b1a-9978-e978b9bdb4cf-kube-api-access-w674x\") pod \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.667293 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00959eed-1bce-4b1a-9978-e978b9bdb4cf-logs\") pod \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.667404 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-config-data\") pod \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\" (UID: \"00959eed-1bce-4b1a-9978-e978b9bdb4cf\") " Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.668379 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/00959eed-1bce-4b1a-9978-e978b9bdb4cf-logs" (OuterVolumeSpecName: "logs") pod "00959eed-1bce-4b1a-9978-e978b9bdb4cf" (UID: "00959eed-1bce-4b1a-9978-e978b9bdb4cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.675949 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00959eed-1bce-4b1a-9978-e978b9bdb4cf-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "00959eed-1bce-4b1a-9978-e978b9bdb4cf" (UID: "00959eed-1bce-4b1a-9978-e978b9bdb4cf"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.676060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00959eed-1bce-4b1a-9978-e978b9bdb4cf-kube-api-access-w674x" (OuterVolumeSpecName: "kube-api-access-w674x") pod "00959eed-1bce-4b1a-9978-e978b9bdb4cf" (UID: "00959eed-1bce-4b1a-9978-e978b9bdb4cf"). InnerVolumeSpecName "kube-api-access-w674x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.711897 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-config-data" (OuterVolumeSpecName: "config-data") pod "00959eed-1bce-4b1a-9978-e978b9bdb4cf" (UID: "00959eed-1bce-4b1a-9978-e978b9bdb4cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.728568 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 13:26:04 crc kubenswrapper[4749]: E1001 13:26:04.729951 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerName="horizon" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.729971 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerName="horizon" Oct 01 13:26:04 crc kubenswrapper[4749]: E1001 13:26:04.730023 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerName="horizon-log" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.730029 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerName="horizon-log" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.730311 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerName="horizon" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.730332 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" containerName="horizon-log" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.737420 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.748737 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.748975 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.748996 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-899k7" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.750747 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-scripts" (OuterVolumeSpecName: "scripts") pod "00959eed-1bce-4b1a-9978-e978b9bdb4cf" (UID: "00959eed-1bce-4b1a-9978-e978b9bdb4cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.752571 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.769602 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w674x\" (UniqueName: \"kubernetes.io/projected/00959eed-1bce-4b1a-9978-e978b9bdb4cf-kube-api-access-w674x\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.769626 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00959eed-1bce-4b1a-9978-e978b9bdb4cf-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.769635 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 
13:26:04.769645 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00959eed-1bce-4b1a-9978-e978b9bdb4cf-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.769662 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00959eed-1bce-4b1a-9978-e978b9bdb4cf-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.871251 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2q5b\" (UniqueName: \"kubernetes.io/projected/b55ebe69-1518-428b-9ceb-383de60316cc-kube-api-access-h2q5b\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.871497 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b55ebe69-1518-428b-9ceb-383de60316cc-openstack-config-secret\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.871524 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55ebe69-1518-428b-9ceb-383de60316cc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.871567 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b55ebe69-1518-428b-9ceb-383de60316cc-openstack-config\") pod \"openstackclient\" (UID: 
\"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.973790 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2q5b\" (UniqueName: \"kubernetes.io/projected/b55ebe69-1518-428b-9ceb-383de60316cc-kube-api-access-h2q5b\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.973857 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b55ebe69-1518-428b-9ceb-383de60316cc-openstack-config-secret\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.973877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55ebe69-1518-428b-9ceb-383de60316cc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.973937 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b55ebe69-1518-428b-9ceb-383de60316cc-openstack-config\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.988502 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b55ebe69-1518-428b-9ceb-383de60316cc-openstack-config\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:04 crc kubenswrapper[4749]: I1001 13:26:04.991980 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2q5b\" (UniqueName: \"kubernetes.io/projected/b55ebe69-1518-428b-9ceb-383de60316cc-kube-api-access-h2q5b\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.054703 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b55ebe69-1518-428b-9ceb-383de60316cc-openstack-config-secret\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.054900 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55ebe69-1518-428b-9ceb-383de60316cc-combined-ca-bundle\") pod \"openstackclient\" (UID: \"b55ebe69-1518-428b-9ceb-383de60316cc\") " pod="openstack/openstackclient" Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.094027 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d96cdbf7c-fpnt7" event={"ID":"00959eed-1bce-4b1a-9978-e978b9bdb4cf","Type":"ContainerDied","Data":"f51535870729f017a6f692d292a76cf1eac326a0a301cfebf29c19eea1651302"} Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.094075 4749 scope.go:117] "RemoveContainer" containerID="66530ca5a8ae469488d794e90dd097c3615d835cd05524f07db2510f26e1b8e0" Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.094206 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d96cdbf7c-fpnt7" Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.134655 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d96cdbf7c-fpnt7"] Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.141974 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d96cdbf7c-fpnt7"] Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.190688 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58f698d9cb-ntlhp"] Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.195959 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 13:26:05 crc kubenswrapper[4749]: E1001 13:26:05.235775 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24:latest" Oct 01 13:26:05 crc kubenswrapper[4749]: E1001 13:26:05.236016 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24:latest,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qvcs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(24e52665-f55b-4137-82ec-7ab7392bca61): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 13:26:05 crc kubenswrapper[4749]: E1001 13:26:05.237907 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.242922 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00959eed-1bce-4b1a-9978-e978b9bdb4cf" path="/var/lib/kubelet/pods/00959eed-1bce-4b1a-9978-e978b9bdb4cf/volumes" Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.358318 4749 scope.go:117] "RemoveContainer" containerID="5c6a4dca9736198a826b2581e2b71aac1596e33d3187fafb784fdac24534fce0" Oct 01 13:26:05 crc 
kubenswrapper[4749]: W1001 13:26:05.364526 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60010c1d_09a2_4f43_9f80_c896d4d02945.slice/crio-783402d68dd0b75d4435548f60b599d91355791d9e97c8a473e1328a2eab77c1 WatchSource:0}: Error finding container 783402d68dd0b75d4435548f60b599d91355791d9e97c8a473e1328a2eab77c1: Status 404 returned error can't find the container with id 783402d68dd0b75d4435548f60b599d91355791d9e97c8a473e1328a2eab77c1 Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.428649 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.443024 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6955885c6f-b6gb9"] Oct 01 13:26:05 crc kubenswrapper[4749]: W1001 13:26:05.479371 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf24660fa_225d_4bf6_a8bb_f4f9df4f24e6.slice/crio-37c64d814b229f9fb6887c56b35d6870106e8fff7556857f217f921460fc157d WatchSource:0}: Error finding container 37c64d814b229f9fb6887c56b35d6870106e8fff7556857f217f921460fc157d: Status 404 returned error can't find the container with id 37c64d814b229f9fb6887c56b35d6870106e8fff7556857f217f921460fc157d Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.781369 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5c6dfb5585-x78z5"] Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.795651 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5fdf7b5778-vxx8p"] Oct 01 13:26:05 crc kubenswrapper[4749]: W1001 13:26:05.828915 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda27e333f_57a3_4257_9e49_e03928cfa02d.slice/crio-432d07deff90d77a620794fbbd154e91df1d177f5e07247cb8828d57852b97a8 WatchSource:0}: Error finding container 432d07deff90d77a620794fbbd154e91df1d177f5e07247cb8828d57852b97a8: Status 404 returned error can't find the container with id 432d07deff90d77a620794fbbd154e91df1d177f5e07247cb8828d57852b97a8 Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.835859 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68c474b9fc-mn4ts"] Oct 01 13:26:05 crc kubenswrapper[4749]: W1001 13:26:05.850577 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb34ccf83_0af2_438a_ad7f_c79c6886db75.slice/crio-b3bf5fe9336ad6bb370c8ab6acdfb3ece14b7b902ce4ba7bddf4c5f3c6bf00a0 WatchSource:0}: Error finding container b3bf5fe9336ad6bb370c8ab6acdfb3ece14b7b902ce4ba7bddf4c5f3c6bf00a0: Status 404 returned error can't find the container with id b3bf5fe9336ad6bb370c8ab6acdfb3ece14b7b902ce4ba7bddf4c5f3c6bf00a0 Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.901043 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c48c76d4d-blnpd"] Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.929049 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-677dbd476-b92fx"] Oct 01 13:26:05 crc kubenswrapper[4749]: I1001 13:26:05.947415 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59d7c7544c-n7mlp"] Oct 01 13:26:05 crc kubenswrapper[4749]: W1001 13:26:05.947740 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e51f972_9f26_4b6b_8213_9261797a1ee0.slice/crio-9cbbbbdf4ccc1152cdc7563fa3a58f41de6c2d684bae2bc8ae7a638adde5fef2 WatchSource:0}: Error finding container 
9cbbbbdf4ccc1152cdc7563fa3a58f41de6c2d684bae2bc8ae7a638adde5fef2: Status 404 returned error can't find the container with id 9cbbbbdf4ccc1152cdc7563fa3a58f41de6c2d684bae2bc8ae7a638adde5fef2 Oct 01 13:26:05 crc kubenswrapper[4749]: W1001 13:26:05.956335 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd658fa13_ec8c_4936_9015_531912d2e050.slice/crio-fd3df1bfa21272382b244cc320e9f6e922ffe1c1e42e8be6f586487e918cdac8 WatchSource:0}: Error finding container fd3df1bfa21272382b244cc320e9f6e922ffe1c1e42e8be6f586487e918cdac8: Status 404 returned error can't find the container with id fd3df1bfa21272382b244cc320e9f6e922ffe1c1e42e8be6f586487e918cdac8 Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.056501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.115792 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48c76d4d-blnpd" event={"ID":"d658fa13-ec8c-4936-9015-531912d2e050","Type":"ContainerStarted","Data":"fd3df1bfa21272382b244cc320e9f6e922ffe1c1e42e8be6f586487e918cdac8"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.120388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" event={"ID":"b34ccf83-0af2-438a-ad7f-c79c6886db75","Type":"ContainerStarted","Data":"b3bf5fe9336ad6bb370c8ab6acdfb3ece14b7b902ce4ba7bddf4c5f3c6bf00a0"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.127005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d7c7544c-n7mlp" event={"ID":"680ec9d6-ccd3-4417-9919-7412600f23fb","Type":"ContainerStarted","Data":"71287c39a2e7409d9aafd8428095ba51e6013ef4beef51f6858a749f53908956"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.129103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6dfb5585-x78z5" 
event={"ID":"c8b1d3a9-044c-475f-b86f-7e099e2b1197","Type":"ContainerStarted","Data":"91a7e2df5cf2ebc331bdf655f47ee5153ee14b97be6c584bc9d0d77204fd1b4a"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.130811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fab3f7e4-b73c-4e23-839f-719e5b8ca205","Type":"ContainerStarted","Data":"8a542d090d53c6a6a4483dcf44c10f45e9bcc3777789339ef819c7139b20d0ee"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.134713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f698d9cb-ntlhp" event={"ID":"60010c1d-09a2-4f43-9f80-c896d4d02945","Type":"ContainerStarted","Data":"f8206c69be3f6ecbb17fa47a68cfad645dc99620b381ea0e0e35641265e10873"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.134748 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f698d9cb-ntlhp" event={"ID":"60010c1d-09a2-4f43-9f80-c896d4d02945","Type":"ContainerStarted","Data":"783402d68dd0b75d4435548f60b599d91355791d9e97c8a473e1328a2eab77c1"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.136405 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-677dbd476-b92fx" event={"ID":"0e51f972-9f26-4b6b-8213-9261797a1ee0","Type":"ContainerStarted","Data":"9cbbbbdf4ccc1152cdc7563fa3a58f41de6c2d684bae2bc8ae7a638adde5fef2"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.138082 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b55ebe69-1518-428b-9ceb-383de60316cc","Type":"ContainerStarted","Data":"459f9de3bd5edf677db290391bb299b04e259291f930e15aa6360067e9bfd767"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.152014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" 
event={"ID":"a27e333f-57a3-4257-9e49-e03928cfa02d","Type":"ContainerStarted","Data":"432d07deff90d77a620794fbbd154e91df1d177f5e07247cb8828d57852b97a8"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.153124 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" event={"ID":"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6","Type":"ContainerStarted","Data":"37c64d814b229f9fb6887c56b35d6870106e8fff7556857f217f921460fc157d"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.157745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"696adfa9-0326-4d60-8a2d-c53ee267a249","Type":"ContainerStarted","Data":"e9e6ae040629eef34ee274693202d2da9d259bf5f862fe958690748041863bcd"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.158749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f38ffc0-1f36-4c0d-9829-770095534d91","Type":"ContainerStarted","Data":"e8940bcda7db698e8b0c84de7bcc49e9d3a29942f9ae8004653266e140afd8e4"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.160036 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" event={"ID":"603bb41c-f663-4085-9036-ab29af4efa15","Type":"ContainerDied","Data":"ea772389df26868defaaa91ece26edc48c6dd3374e039c9f3dfddeb841cd7fee"} Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.159951 4749 generic.go:334] "Generic (PLEG): container finished" podID="603bb41c-f663-4085-9036-ab29af4efa15" containerID="ea772389df26868defaaa91ece26edc48c6dd3374e039c9f3dfddeb841cd7fee" exitCode=0 Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.160689 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" containerName="ceilometer-notification-agent" containerID="cri-o://88a4616d6bd1607daba64394bdfadf2bab1654bff024f2f5b236321a1f132f85" 
gracePeriod=30 Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.160992 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" containerName="sg-core" containerID="cri-o://5ca1278161feea90d77b20382284269477a0f48da0fd9a28f3f40443811b3cac" gracePeriod=30 Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.276792 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:26:06 crc kubenswrapper[4749]: W1001 13:26:06.296479 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7be5a817_c26e_4fcf_8da2_d61f9b58c2a6.slice/crio-16ebf550cb3786752dfb951face918f56bd4a6c64811eb5e618fc4460c617536 WatchSource:0}: Error finding container 16ebf550cb3786752dfb951face918f56bd4a6c64811eb5e618fc4460c617536: Status 404 returned error can't find the container with id 16ebf550cb3786752dfb951face918f56bd4a6c64811eb5e618fc4460c617536 Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.698912 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84f6c74b66-trbb9" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Oct 01 13:26:06 crc kubenswrapper[4749]: I1001 13:26:06.699393 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.169030 4749 generic.go:334] "Generic (PLEG): container finished" podID="f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" containerID="ef2a7167ba02a631ff0a903a9b47647f05767651def8ee9c634b07acee08f525" exitCode=0 Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.169133 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" event={"ID":"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6","Type":"ContainerDied","Data":"ef2a7167ba02a631ff0a903a9b47647f05767651def8ee9c634b07acee08f525"} Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.170184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6","Type":"ContainerStarted","Data":"16ebf550cb3786752dfb951face918f56bd4a6c64811eb5e618fc4460c617536"} Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.188842 4749 generic.go:334] "Generic (PLEG): container finished" podID="24e52665-f55b-4137-82ec-7ab7392bca61" containerID="5ca1278161feea90d77b20382284269477a0f48da0fd9a28f3f40443811b3cac" exitCode=2 Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.189309 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e52665-f55b-4137-82ec-7ab7392bca61","Type":"ContainerDied","Data":"5ca1278161feea90d77b20382284269477a0f48da0fd9a28f3f40443811b3cac"} Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.455339 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.498680 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.700991 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.826519 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.843576 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-swift-storage-0\") pod \"603bb41c-f663-4085-9036-ab29af4efa15\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.843670 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-svc\") pod \"603bb41c-f663-4085-9036-ab29af4efa15\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.843715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqkjz\" (UniqueName: \"kubernetes.io/projected/603bb41c-f663-4085-9036-ab29af4efa15-kube-api-access-kqkjz\") pod \"603bb41c-f663-4085-9036-ab29af4efa15\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.843797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-nb\") pod \"603bb41c-f663-4085-9036-ab29af4efa15\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.843961 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-config\") pod \"603bb41c-f663-4085-9036-ab29af4efa15\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.844111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-sb\") pod \"603bb41c-f663-4085-9036-ab29af4efa15\" (UID: \"603bb41c-f663-4085-9036-ab29af4efa15\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.881122 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-config" (OuterVolumeSpecName: "config") pod "603bb41c-f663-4085-9036-ab29af4efa15" (UID: "603bb41c-f663-4085-9036-ab29af4efa15"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.902435 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603bb41c-f663-4085-9036-ab29af4efa15-kube-api-access-kqkjz" (OuterVolumeSpecName: "kube-api-access-kqkjz") pod "603bb41c-f663-4085-9036-ab29af4efa15" (UID: "603bb41c-f663-4085-9036-ab29af4efa15"). InnerVolumeSpecName "kube-api-access-kqkjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.919688 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "603bb41c-f663-4085-9036-ab29af4efa15" (UID: "603bb41c-f663-4085-9036-ab29af4efa15"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.945885 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "603bb41c-f663-4085-9036-ab29af4efa15" (UID: "603bb41c-f663-4085-9036-ab29af4efa15"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.960818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r8n6\" (UniqueName: \"kubernetes.io/projected/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-kube-api-access-7r8n6\") pod \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.960878 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-config\") pod \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.960947 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-nb\") pod \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.961091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-svc\") pod \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.961132 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-swift-storage-0\") pod \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.961180 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-sb\") pod \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\" (UID: \"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6\") " Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.961594 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.961609 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.961617 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.961626 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqkjz\" (UniqueName: \"kubernetes.io/projected/603bb41c-f663-4085-9036-ab29af4efa15-kube-api-access-kqkjz\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.961751 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "603bb41c-f663-4085-9036-ab29af4efa15" (UID: "603bb41c-f663-4085-9036-ab29af4efa15"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.976482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-kube-api-access-7r8n6" (OuterVolumeSpecName: "kube-api-access-7r8n6") pod "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" (UID: "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6"). InnerVolumeSpecName "kube-api-access-7r8n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.978982 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "603bb41c-f663-4085-9036-ab29af4efa15" (UID: "603bb41c-f663-4085-9036-ab29af4efa15"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:07 crc kubenswrapper[4749]: I1001 13:26:07.998375 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" (UID: "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.005571 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" (UID: "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.006733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" (UID: "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.006891 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" (UID: "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.021421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-config" (OuterVolumeSpecName: "config") pod "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" (UID: "f24660fa-225d-4bf6-a8bb-f4f9df4f24e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.063579 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.063819 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.063906 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.063967 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.064032 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r8n6\" (UniqueName: \"kubernetes.io/projected/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-kube-api-access-7r8n6\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.064097 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.064164 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/603bb41c-f663-4085-9036-ab29af4efa15-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.064248 
4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.202239 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f698d9cb-ntlhp" event={"ID":"60010c1d-09a2-4f43-9f80-c896d4d02945","Type":"ContainerStarted","Data":"10e12106c5dfffa2cadb4bca1b1d999b0999c41bc1f59a64d6b037fd834db6ed"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.204560 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-677dbd476-b92fx" event={"ID":"0e51f972-9f26-4b6b-8213-9261797a1ee0","Type":"ContainerStarted","Data":"52f0c281709e3c7df25b38d3175f903532c911aa40fa14372ff3a766e729c861"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.206681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" event={"ID":"603bb41c-f663-4085-9036-ab29af4efa15","Type":"ContainerDied","Data":"ff1d77ca9d8a313fddd862a53bc691959c60bdce49a5346af1bdb675787ba161"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.206714 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f5c595757-9xvdk" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.206732 4749 scope.go:117] "RemoveContainer" containerID="ea772389df26868defaaa91ece26edc48c6dd3374e039c9f3dfddeb841cd7fee" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.208411 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" event={"ID":"b34ccf83-0af2-438a-ad7f-c79c6886db75","Type":"ContainerStarted","Data":"958fc4615311f4201bc99f778be02331e193477546d3eeeb867686d1ad7e430c"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.211841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d7c7544c-n7mlp" event={"ID":"680ec9d6-ccd3-4417-9919-7412600f23fb","Type":"ContainerStarted","Data":"1e13f59888e7103fa400000f627e2453a502b6b2491cf583aa7cb0833d582c7c"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.213726 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f38ffc0-1f36-4c0d-9829-770095534d91","Type":"ContainerStarted","Data":"8daaff65098af9b5ce22d4a46e81726de546d2e56aabe210fc184ab62a57872b"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.217747 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"696adfa9-0326-4d60-8a2d-c53ee267a249","Type":"ContainerStarted","Data":"9a13902c479b1c7e0d9fa5e80874bd55d3f8810114a85279c90acb553fd7964a"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.222519 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6","Type":"ContainerStarted","Data":"5c7117c92e176d55285b89f0ef32a676d9ac4084abb07d48c664a5ab6583f838"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.224640 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48c76d4d-blnpd" 
event={"ID":"d658fa13-ec8c-4936-9015-531912d2e050","Type":"ContainerStarted","Data":"3b4f2110a7200bbd3dea79b60ae19d03800b62cc9162c4b42d603ece72c13f52"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.226139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" event={"ID":"f24660fa-225d-4bf6-a8bb-f4f9df4f24e6","Type":"ContainerDied","Data":"37c64d814b229f9fb6887c56b35d6870106e8fff7556857f217f921460fc157d"} Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.226151 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6955885c6f-b6gb9" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.226458 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.232149 4749 scope.go:117] "RemoveContainer" containerID="ef2a7167ba02a631ff0a903a9b47647f05767651def8ee9c634b07acee08f525" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.265273 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f5c595757-9xvdk"] Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.276149 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f5c595757-9xvdk"] Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.284480 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.325804 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6955885c6f-b6gb9"] Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.331594 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6955885c6f-b6gb9"] Oct 01 13:26:08 crc kubenswrapper[4749]: I1001 13:26:08.359934 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/watcher-decision-engine-0"] Oct 01 13:26:09 crc kubenswrapper[4749]: I1001 13:26:09.241844 4749 generic.go:334] "Generic (PLEG): container finished" podID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerID="958fc4615311f4201bc99f778be02331e193477546d3eeeb867686d1ad7e430c" exitCode=0 Oct 01 13:26:09 crc kubenswrapper[4749]: I1001 13:26:09.242729 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603bb41c-f663-4085-9036-ab29af4efa15" path="/var/lib/kubelet/pods/603bb41c-f663-4085-9036-ab29af4efa15/volumes" Oct 01 13:26:09 crc kubenswrapper[4749]: I1001 13:26:09.243409 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" path="/var/lib/kubelet/pods/f24660fa-225d-4bf6-a8bb-f4f9df4f24e6/volumes" Oct 01 13:26:09 crc kubenswrapper[4749]: I1001 13:26:09.244083 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:26:09 crc kubenswrapper[4749]: I1001 13:26:09.244103 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" event={"ID":"b34ccf83-0af2-438a-ad7f-c79c6886db75","Type":"ContainerDied","Data":"958fc4615311f4201bc99f778be02331e193477546d3eeeb867686d1ad7e430c"} Oct 01 13:26:09 crc kubenswrapper[4749]: I1001 13:26:09.302910 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=32.302888382 podStartE2EDuration="32.302888382s" podCreationTimestamp="2025-10-01 13:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:09.29369646 +0000 UTC m=+1229.347681359" watchObservedRunningTime="2025-10-01 13:26:09.302888382 +0000 UTC m=+1229.356873301" Oct 01 13:26:09 crc kubenswrapper[4749]: I1001 13:26:09.329648 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-58f698d9cb-ntlhp" podStartSLOduration=20.329627304 podStartE2EDuration="20.329627304s" podCreationTimestamp="2025-10-01 13:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:09.320658209 +0000 UTC m=+1229.374643108" watchObservedRunningTime="2025-10-01 13:26:09.329627304 +0000 UTC m=+1229.383612223" Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.253950 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-677dbd476-b92fx" event={"ID":"0e51f972-9f26-4b6b-8213-9261797a1ee0","Type":"ContainerStarted","Data":"b6738598d8a834671252c2a29ed3e8578a4086eeb754c89e1f09917dc5a29405"} Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.256531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48c76d4d-blnpd" event={"ID":"d658fa13-ec8c-4936-9015-531912d2e050","Type":"ContainerStarted","Data":"21c48d30d51810699355b4aafbf313ede9fab00195b8e4464bf27bbe0323f82d"} Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.258630 4749 generic.go:334] "Generic (PLEG): container finished" podID="24e52665-f55b-4137-82ec-7ab7392bca61" containerID="88a4616d6bd1607daba64394bdfadf2bab1654bff024f2f5b236321a1f132f85" exitCode=0 Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.258712 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e52665-f55b-4137-82ec-7ab7392bca61","Type":"ContainerDied","Data":"88a4616d6bd1607daba64394bdfadf2bab1654bff024f2f5b236321a1f132f85"} Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.260951 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" event={"ID":"b34ccf83-0af2-438a-ad7f-c79c6886db75","Type":"ContainerStarted","Data":"496383534a049afa7139823bfc265a218724711b434d4209d1f1f68e8eb07326"} Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.263147 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59d7c7544c-n7mlp" event={"ID":"680ec9d6-ccd3-4417-9919-7412600f23fb","Type":"ContainerStarted","Data":"d76d870de32094faca2486d8d4b9ff31e24d5e746e86fd2141c9a452f273033d"} Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.265364 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" containerID="cri-o://8a542d090d53c6a6a4483dcf44c10f45e9bcc3777789339ef819c7139b20d0ee" gracePeriod=30 Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.265496 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f38ffc0-1f36-4c0d-9829-770095534d91","Type":"ContainerStarted","Data":"27a7c8d16c36fe7ae9ff601be709ab8003dd8855197b5b558dc33591e0ebdb31"} Oct 01 13:26:10 crc kubenswrapper[4749]: I1001 13:26:10.984196 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.140795 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-config-data\") pod \"24e52665-f55b-4137-82ec-7ab7392bca61\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.140941 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-sg-core-conf-yaml\") pod \"24e52665-f55b-4137-82ec-7ab7392bca61\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.140978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qvcs\" (UniqueName: \"kubernetes.io/projected/24e52665-f55b-4137-82ec-7ab7392bca61-kube-api-access-8qvcs\") pod \"24e52665-f55b-4137-82ec-7ab7392bca61\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.141039 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-log-httpd\") pod \"24e52665-f55b-4137-82ec-7ab7392bca61\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.141090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-scripts\") pod \"24e52665-f55b-4137-82ec-7ab7392bca61\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.141124 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-run-httpd\") pod \"24e52665-f55b-4137-82ec-7ab7392bca61\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.141162 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-combined-ca-bundle\") pod \"24e52665-f55b-4137-82ec-7ab7392bca61\" (UID: \"24e52665-f55b-4137-82ec-7ab7392bca61\") " Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.142136 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "24e52665-f55b-4137-82ec-7ab7392bca61" (UID: "24e52665-f55b-4137-82ec-7ab7392bca61"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.142540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "24e52665-f55b-4137-82ec-7ab7392bca61" (UID: "24e52665-f55b-4137-82ec-7ab7392bca61"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.153759 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-scripts" (OuterVolumeSpecName: "scripts") pod "24e52665-f55b-4137-82ec-7ab7392bca61" (UID: "24e52665-f55b-4137-82ec-7ab7392bca61"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.155075 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24e52665-f55b-4137-82ec-7ab7392bca61-kube-api-access-8qvcs" (OuterVolumeSpecName: "kube-api-access-8qvcs") pod "24e52665-f55b-4137-82ec-7ab7392bca61" (UID: "24e52665-f55b-4137-82ec-7ab7392bca61"). InnerVolumeSpecName "kube-api-access-8qvcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.181596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24e52665-f55b-4137-82ec-7ab7392bca61" (UID: "24e52665-f55b-4137-82ec-7ab7392bca61"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.198587 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "24e52665-f55b-4137-82ec-7ab7392bca61" (UID: "24e52665-f55b-4137-82ec-7ab7392bca61"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.205392 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-config-data" (OuterVolumeSpecName: "config-data") pod "24e52665-f55b-4137-82ec-7ab7392bca61" (UID: "24e52665-f55b-4137-82ec-7ab7392bca61"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.243446 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qvcs\" (UniqueName: \"kubernetes.io/projected/24e52665-f55b-4137-82ec-7ab7392bca61-kube-api-access-8qvcs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.243495 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.243505 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.243516 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/24e52665-f55b-4137-82ec-7ab7392bca61-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.243525 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.243556 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.243565 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/24e52665-f55b-4137-82ec-7ab7392bca61-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.284506 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"24e52665-f55b-4137-82ec-7ab7392bca61","Type":"ContainerDied","Data":"13d2c646ddbe1198028ba22d41a228313966c057b9ba0a84fda242c2d4fd0deb"} Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.284556 4749 scope.go:117] "RemoveContainer" containerID="5ca1278161feea90d77b20382284269477a0f48da0fd9a28f3f40443811b3cac" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.284668 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.300353 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerName="glance-log" containerID="cri-o://5c7117c92e176d55285b89f0ef32a676d9ac4084abb07d48c664a5ab6583f838" gracePeriod=30 Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.300457 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6","Type":"ContainerStarted","Data":"471780b2bdc8745aebe7cfe840d05d191acd2d01fec415465f610d75fa940296"} Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.300490 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.300998 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.301636 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.301743 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" 
containerName="glance-log" containerID="cri-o://8daaff65098af9b5ce22d4a46e81726de546d2e56aabe210fc184ab62a57872b" gracePeriod=30 Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.302307 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.302347 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.302426 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerName="glance-httpd" containerID="cri-o://471780b2bdc8745aebe7cfe840d05d191acd2d01fec415465f610d75fa940296" gracePeriod=30 Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.302683 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.302768 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerName="glance-httpd" containerID="cri-o://27a7c8d16c36fe7ae9ff601be709ab8003dd8855197b5b558dc33591e0ebdb31" gracePeriod=30 Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.465938 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-677dbd476-b92fx" podStartSLOduration=14.465920633 podStartE2EDuration="14.465920633s" podCreationTimestamp="2025-10-01 13:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:11.437976075 +0000 UTC m=+1231.491960974" watchObservedRunningTime="2025-10-01 13:26:11.465920633 +0000 UTC m=+1231.519905532" Oct 01 13:26:11 crc kubenswrapper[4749]: 
I1001 13:26:11.473514 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=34.473496368 podStartE2EDuration="34.473496368s" podCreationTimestamp="2025-10-01 13:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:11.458262456 +0000 UTC m=+1231.512247355" watchObservedRunningTime="2025-10-01 13:26:11.473496368 +0000 UTC m=+1231.527481267" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.485809 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c48c76d4d-blnpd" podStartSLOduration=16.485794132 podStartE2EDuration="16.485794132s" podCreationTimestamp="2025-10-01 13:25:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:11.478049493 +0000 UTC m=+1231.532034402" watchObservedRunningTime="2025-10-01 13:26:11.485794132 +0000 UTC m=+1231.539779031" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.500720 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-59d7c7544c-n7mlp" podStartSLOduration=20.500695513 podStartE2EDuration="20.500695513s" podCreationTimestamp="2025-10-01 13:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:11.498991543 +0000 UTC m=+1231.552976452" watchObservedRunningTime="2025-10-01 13:26:11.500695513 +0000 UTC m=+1231.554680422" Oct 01 13:26:11 crc kubenswrapper[4749]: E1001 13:26:11.513442 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7be5a817_c26e_4fcf_8da2_d61f9b58c2a6.slice/crio-5c7117c92e176d55285b89f0ef32a676d9ac4084abb07d48c664a5ab6583f838.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.549967 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=34.549950443 podStartE2EDuration="34.549950443s" podCreationTimestamp="2025-10-01 13:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:11.521476829 +0000 UTC m=+1231.575461728" watchObservedRunningTime="2025-10-01 13:26:11.549950443 +0000 UTC m=+1231.603935342" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.563345 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" podStartSLOduration=17.563318309 podStartE2EDuration="17.563318309s" podCreationTimestamp="2025-10-01 13:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:11.548607463 +0000 UTC m=+1231.602592362" watchObservedRunningTime="2025-10-01 13:26:11.563318309 +0000 UTC m=+1231.617303208" Oct 01 13:26:11 crc kubenswrapper[4749]: I1001 13:26:11.827108 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 01 13:26:12 crc kubenswrapper[4749]: I1001 13:26:12.063185 4749 scope.go:117] "RemoveContainer" containerID="88a4616d6bd1607daba64394bdfadf2bab1654bff024f2f5b236321a1f132f85" Oct 01 13:26:12 crc kubenswrapper[4749]: I1001 13:26:12.322365 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerID="27a7c8d16c36fe7ae9ff601be709ab8003dd8855197b5b558dc33591e0ebdb31" exitCode=0 Oct 01 
13:26:12 crc kubenswrapper[4749]: I1001 13:26:12.322660 4749 generic.go:334] "Generic (PLEG): container finished" podID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerID="8daaff65098af9b5ce22d4a46e81726de546d2e56aabe210fc184ab62a57872b" exitCode=143 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.322724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f38ffc0-1f36-4c0d-9829-770095534d91","Type":"ContainerDied","Data":"27a7c8d16c36fe7ae9ff601be709ab8003dd8855197b5b558dc33591e0ebdb31"} Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.322758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f38ffc0-1f36-4c0d-9829-770095534d91","Type":"ContainerDied","Data":"8daaff65098af9b5ce22d4a46e81726de546d2e56aabe210fc184ab62a57872b"} Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.344691 4749 generic.go:334] "Generic (PLEG): container finished" podID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerID="8a542d090d53c6a6a4483dcf44c10f45e9bcc3777789339ef819c7139b20d0ee" exitCode=1 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.344735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fab3f7e4-b73c-4e23-839f-719e5b8ca205","Type":"ContainerDied","Data":"8a542d090d53c6a6a4483dcf44c10f45e9bcc3777789339ef819c7139b20d0ee"} Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.349035 4749 generic.go:334] "Generic (PLEG): container finished" podID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerID="070469cc902a7a3114f0e257a6257c4a6d561f9bd6e5bdf6b9ee86f6304d2e5f" exitCode=137 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.349099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84f6c74b66-trbb9" 
event={"ID":"7ae81866-7b3d-44f4-a4fd-5b49e2223352","Type":"ContainerDied","Data":"070469cc902a7a3114f0e257a6257c4a6d561f9bd6e5bdf6b9ee86f6304d2e5f"} Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.355844 4749 generic.go:334] "Generic (PLEG): container finished" podID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerID="471780b2bdc8745aebe7cfe840d05d191acd2d01fec415465f610d75fa940296" exitCode=0 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.355894 4749 generic.go:334] "Generic (PLEG): container finished" podID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerID="5c7117c92e176d55285b89f0ef32a676d9ac4084abb07d48c664a5ab6583f838" exitCode=143 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.355920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6","Type":"ContainerDied","Data":"471780b2bdc8745aebe7cfe840d05d191acd2d01fec415465f610d75fa940296"} Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:12.355979 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6","Type":"ContainerDied","Data":"5c7117c92e176d55285b89f0ef32a676d9ac4084abb07d48c664a5ab6583f838"} Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.143716 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.669790 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-85b88cb8b7-2gzx9"] Oct 01 13:26:18 crc kubenswrapper[4749]: E1001 13:26:13.674814 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603bb41c-f663-4085-9036-ab29af4efa15" containerName="init" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.674842 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="603bb41c-f663-4085-9036-ab29af4efa15" containerName="init" Oct 01 
13:26:18 crc kubenswrapper[4749]: E1001 13:26:13.674896 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" containerName="sg-core" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.674902 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" containerName="sg-core" Oct 01 13:26:18 crc kubenswrapper[4749]: E1001 13:26:13.674922 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" containerName="ceilometer-notification-agent" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.674929 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" containerName="ceilometer-notification-agent" Oct 01 13:26:18 crc kubenswrapper[4749]: E1001 13:26:13.674945 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" containerName="init" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.674950 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" containerName="init" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.675238 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="603bb41c-f663-4085-9036-ab29af4efa15" containerName="init" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.675251 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" containerName="ceilometer-notification-agent" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.675265 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" containerName="sg-core" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.675274 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24660fa-225d-4bf6-a8bb-f4f9df4f24e6" containerName="init" Oct 01 13:26:18 crc 
kubenswrapper[4749]: I1001 13:26:13.676240 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.685132 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.685330 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.685432 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.689294 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85b88cb8b7-2gzx9"] Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.798918 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-config-data\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.798969 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-run-httpd\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.799010 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-combined-ca-bundle\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " 
pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.799027 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-etc-swift\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.799051 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-public-tls-certs\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.799076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjd8v\" (UniqueName: \"kubernetes.io/projected/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-kube-api-access-sjd8v\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.799091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-log-httpd\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.799115 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-internal-tls-certs\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: 
\"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.900924 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-etc-swift\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.900964 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-combined-ca-bundle\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.900997 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-public-tls-certs\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.901023 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjd8v\" (UniqueName: \"kubernetes.io/projected/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-kube-api-access-sjd8v\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.901039 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-log-httpd\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " 
pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.901068 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-internal-tls-certs\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.901198 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-config-data\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.901243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-run-httpd\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.901712 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-run-httpd\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.909346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-public-tls-certs\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 
13:26:13.909569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-combined-ca-bundle\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.909682 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-log-httpd\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.909705 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-internal-tls-certs\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.929349 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-etc-swift\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.933024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-config-data\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:13.940173 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sjd8v\" (UniqueName: \"kubernetes.io/projected/6139ffc4-c70f-45d5-aa79-6fc7b79f2034-kube-api-access-sjd8v\") pod \"swift-proxy-85b88cb8b7-2gzx9\" (UID: \"6139ffc4-c70f-45d5-aa79-6fc7b79f2034\") " pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:14.000647 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:14.338009 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:14.419992 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:15.134140 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:15.226852 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f7f865789-9mjp6"] Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:15.227060 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerName="dnsmasq-dns" containerID="cri-o://d2bfb15f46d0e5919d2d8e5d14975a28845e64239f7e1258b651e08a349a7316" gracePeriod=10 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.163574 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-677dbd476-b92fx" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.217017 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c48c76d4d-blnpd"] Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.221428 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api-log" containerID="cri-o://3b4f2110a7200bbd3dea79b60ae19d03800b62cc9162c4b42d603ece72c13f52" gracePeriod=30 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.221593 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api" containerID="cri-o://21c48d30d51810699355b4aafbf313ede9fab00195b8e4464bf27bbe0323f82d" gracePeriod=30 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.237024 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": EOF" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.243488 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": EOF" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.244077 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": EOF" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.425573 4749 generic.go:334] "Generic (PLEG): container finished" podID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerID="d2bfb15f46d0e5919d2d8e5d14975a28845e64239f7e1258b651e08a349a7316" exitCode=0 Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.425611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" 
event={"ID":"c79b5ea5-4a70-4868-b72a-bc8efb0cb967","Type":"ContainerDied","Data":"d2bfb15f46d0e5919d2d8e5d14975a28845e64239f7e1258b651e08a349a7316"} Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:16.700006 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84f6c74b66-trbb9" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:17.819626 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:18.143965 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:18.155910 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 01 13:26:18 crc kubenswrapper[4749]: I1001 13:26:18.446712 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.213352 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.251892 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.323468 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-config-data\") pod \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.323804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjbxh\" (UniqueName: \"kubernetes.io/projected/6f38ffc0-1f36-4c0d-9829-770095534d91-kube-api-access-tjbxh\") pod \"6f38ffc0-1f36-4c0d-9829-770095534d91\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.323858 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-httpd-run\") pod \"6f38ffc0-1f36-4c0d-9829-770095534d91\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.323950 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-logs\") pod \"6f38ffc0-1f36-4c0d-9829-770095534d91\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.323979 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-combined-ca-bundle\") pod \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.324018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fab3f7e4-b73c-4e23-839f-719e5b8ca205-logs\") pod \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.324033 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p55lr\" (UniqueName: \"kubernetes.io/projected/fab3f7e4-b73c-4e23-839f-719e5b8ca205-kube-api-access-p55lr\") pod \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.324101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-custom-prometheus-ca\") pod \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\" (UID: \"fab3f7e4-b73c-4e23-839f-719e5b8ca205\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.324141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6f38ffc0-1f36-4c0d-9829-770095534d91\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.324165 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-combined-ca-bundle\") pod \"6f38ffc0-1f36-4c0d-9829-770095534d91\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.324184 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-config-data\") pod \"6f38ffc0-1f36-4c0d-9829-770095534d91\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.324197 4749 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-scripts\") pod \"6f38ffc0-1f36-4c0d-9829-770095534d91\" (UID: \"6f38ffc0-1f36-4c0d-9829-770095534d91\") " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.326029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fab3f7e4-b73c-4e23-839f-719e5b8ca205-logs" (OuterVolumeSpecName: "logs") pod "fab3f7e4-b73c-4e23-839f-719e5b8ca205" (UID: "fab3f7e4-b73c-4e23-839f-719e5b8ca205"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.326286 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6f38ffc0-1f36-4c0d-9829-770095534d91" (UID: "6f38ffc0-1f36-4c0d-9829-770095534d91"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.326457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-logs" (OuterVolumeSpecName: "logs") pod "6f38ffc0-1f36-4c0d-9829-770095534d91" (UID: "6f38ffc0-1f36-4c0d-9829-770095534d91"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.334349 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-scripts" (OuterVolumeSpecName: "scripts") pod "6f38ffc0-1f36-4c0d-9829-770095534d91" (UID: "6f38ffc0-1f36-4c0d-9829-770095534d91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.336512 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "6f38ffc0-1f36-4c0d-9829-770095534d91" (UID: "6f38ffc0-1f36-4c0d-9829-770095534d91"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.342367 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f38ffc0-1f36-4c0d-9829-770095534d91-kube-api-access-tjbxh" (OuterVolumeSpecName: "kube-api-access-tjbxh") pod "6f38ffc0-1f36-4c0d-9829-770095534d91" (UID: "6f38ffc0-1f36-4c0d-9829-770095534d91"). InnerVolumeSpecName "kube-api-access-tjbxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.349522 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab3f7e4-b73c-4e23-839f-719e5b8ca205-kube-api-access-p55lr" (OuterVolumeSpecName: "kube-api-access-p55lr") pod "fab3f7e4-b73c-4e23-839f-719e5b8ca205" (UID: "fab3f7e4-b73c-4e23-839f-719e5b8ca205"). InnerVolumeSpecName "kube-api-access-p55lr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.378427 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:52454->10.217.0.178:9311: read: connection reset by peer" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.388536 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f38ffc0-1f36-4c0d-9829-770095534d91" (UID: "6f38ffc0-1f36-4c0d-9829-770095534d91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.392394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fab3f7e4-b73c-4e23-839f-719e5b8ca205" (UID: "fab3f7e4-b73c-4e23-839f-719e5b8ca205"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.409865 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "fab3f7e4-b73c-4e23-839f-719e5b8ca205" (UID: "fab3f7e4-b73c-4e23-839f-719e5b8ca205"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.418605 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-config-data" (OuterVolumeSpecName: "config-data") pod "fab3f7e4-b73c-4e23-839f-719e5b8ca205" (UID: "fab3f7e4-b73c-4e23-839f-719e5b8ca205"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426511 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426544 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjbxh\" (UniqueName: \"kubernetes.io/projected/6f38ffc0-1f36-4c0d-9829-770095534d91-kube-api-access-tjbxh\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426559 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426573 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f38ffc0-1f36-4c0d-9829-770095534d91-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426581 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426589 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p55lr\" (UniqueName: 
\"kubernetes.io/projected/fab3f7e4-b73c-4e23-839f-719e5b8ca205-kube-api-access-p55lr\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426597 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fab3f7e4-b73c-4e23-839f-719e5b8ca205-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426605 4749 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/fab3f7e4-b73c-4e23-839f-719e5b8ca205-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.426631 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.427661 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.427681 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.445024 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-config-data" (OuterVolumeSpecName: "config-data") pod "6f38ffc0-1f36-4c0d-9829-770095534d91" (UID: "6f38ffc0-1f36-4c0d-9829-770095534d91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.456454 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6f38ffc0-1f36-4c0d-9829-770095534d91","Type":"ContainerDied","Data":"e8940bcda7db698e8b0c84de7bcc49e9d3a29942f9ae8004653266e140afd8e4"} Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.456570 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.460180 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.460472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"fab3f7e4-b73c-4e23-839f-719e5b8ca205","Type":"ContainerDied","Data":"19e258b3a3042561a761627661d9140ad065447df9a175e2385e71b6291795c8"} Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.463497 4749 generic.go:334] "Generic (PLEG): container finished" podID="d658fa13-ec8c-4936-9015-531912d2e050" containerID="21c48d30d51810699355b4aafbf313ede9fab00195b8e4464bf27bbe0323f82d" exitCode=0 Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.463520 4749 generic.go:334] "Generic (PLEG): container finished" podID="d658fa13-ec8c-4936-9015-531912d2e050" containerID="3b4f2110a7200bbd3dea79b60ae19d03800b62cc9162c4b42d603ece72c13f52" exitCode=143 Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.464308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48c76d4d-blnpd" event={"ID":"d658fa13-ec8c-4936-9015-531912d2e050","Type":"ContainerDied","Data":"21c48d30d51810699355b4aafbf313ede9fab00195b8e4464bf27bbe0323f82d"} Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.464331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-5c48c76d4d-blnpd" event={"ID":"d658fa13-ec8c-4936-9015-531912d2e050","Type":"ContainerDied","Data":"3b4f2110a7200bbd3dea79b60ae19d03800b62cc9162c4b42d603ece72c13f52"} Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.484189 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.513717 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.525405 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.534706 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.534737 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f38ffc0-1f36-4c0d-9829-770095534d91-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.561367 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.571325 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.584968 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:19 crc kubenswrapper[4749]: E1001 13:26:19.585425 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerName="glance-log" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 
13:26:19.585443 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerName="glance-log" Oct 01 13:26:19 crc kubenswrapper[4749]: E1001 13:26:19.585465 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.585472 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 crc kubenswrapper[4749]: E1001 13:26:19.585492 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.585499 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 crc kubenswrapper[4749]: E1001 13:26:19.585515 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerName="glance-httpd" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.585521 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerName="glance-httpd" Oct 01 13:26:19 crc kubenswrapper[4749]: E1001 13:26:19.585529 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.585535 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.585724 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 
crc kubenswrapper[4749]: I1001 13:26:19.585752 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerName="glance-log" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.585767 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" containerName="glance-httpd" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.585776 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.586114 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" containerName="watcher-decision-engine" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.586821 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.592031 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.592648 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.593186 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.604846 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-58f698d9cb-ntlhp" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.605032 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-58f698d9cb-ntlhp" 
podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.610709 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.614258 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-58f698d9cb-ntlhp" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.614327 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-58f698d9cb-ntlhp" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.636578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.636699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.636727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcz9r\" (UniqueName: 
\"kubernetes.io/projected/32fe2369-6499-4f14-8c75-33933c8bb608-kube-api-access-hcz9r\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.636765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.636793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-logs\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.636846 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-config-data\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.636987 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.637044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-scripts\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.642419 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.644977 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.650744 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.679600 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.738831 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.738876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.738914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-scripts\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " 
pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739031 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739176 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739229 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739450 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcz9r\" (UniqueName: \"kubernetes.io/projected/32fe2369-6499-4f14-8c75-33933c8bb608-kube-api-access-hcz9r\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739491 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " 
pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739518 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99dlt\" (UniqueName: \"kubernetes.io/projected/3f0334e5-add1-4ced-bad4-7e77d528e28a-kube-api-access-99dlt\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739559 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-logs\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739595 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0334e5-add1-4ced-bad4-7e77d528e28a-logs\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739624 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" 
Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739694 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-config-data\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.739781 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.741761 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-logs\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.744822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.744922 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-config-data\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.746630 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-scripts\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.746784 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.765813 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcz9r\" (UniqueName: \"kubernetes.io/projected/32fe2369-6499-4f14-8c75-33933c8bb608-kube-api-access-hcz9r\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.791095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.842402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.842515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99dlt\" (UniqueName: 
\"kubernetes.io/projected/3f0334e5-add1-4ced-bad4-7e77d528e28a-kube-api-access-99dlt\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.842540 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0334e5-add1-4ced-bad4-7e77d528e28a-logs\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.842555 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.842593 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.843251 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0334e5-add1-4ced-bad4-7e77d528e28a-logs\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.846993 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" 
(UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.849054 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.851268 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f0334e5-add1-4ced-bad4-7e77d528e28a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.862966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99dlt\" (UniqueName: \"kubernetes.io/projected/3f0334e5-add1-4ced-bad4-7e77d528e28a-kube-api-access-99dlt\") pod \"watcher-decision-engine-0\" (UID: \"3f0334e5-add1-4ced-bad4-7e77d528e28a\") " pod="openstack/watcher-decision-engine-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.915626 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:26:19 crc kubenswrapper[4749]: I1001 13:26:19.971717 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:20 crc kubenswrapper[4749]: I1001 13:26:20.468469 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": dial tcp 10.217.0.178:9311: connect: connection refused" Oct 01 13:26:20 crc kubenswrapper[4749]: I1001 13:26:20.469070 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": dial tcp 10.217.0.178:9311: connect: connection refused" Oct 01 13:26:21 crc kubenswrapper[4749]: I1001 13:26:21.242766 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f38ffc0-1f36-4c0d-9829-770095534d91" path="/var/lib/kubelet/pods/6f38ffc0-1f36-4c0d-9829-770095534d91/volumes" Oct 01 13:26:21 crc kubenswrapper[4749]: I1001 13:26:21.244021 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fab3f7e4-b73c-4e23-839f-719e5b8ca205" path="/var/lib/kubelet/pods/fab3f7e4-b73c-4e23-839f-719e5b8ca205/volumes" Oct 01 13:26:22 crc kubenswrapper[4749]: I1001 13:26:22.444643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-59d7c7544c-n7mlp" Oct 01 13:26:22 crc kubenswrapper[4749]: I1001 13:26:22.509470 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-58f698d9cb-ntlhp"] Oct 01 13:26:22 crc kubenswrapper[4749]: I1001 13:26:22.509689 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-58f698d9cb-ntlhp" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-api" containerID="cri-o://f8206c69be3f6ecbb17fa47a68cfad645dc99620b381ea0e0e35641265e10873" 
gracePeriod=30 Oct 01 13:26:22 crc kubenswrapper[4749]: I1001 13:26:22.509894 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-58f698d9cb-ntlhp" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-httpd" containerID="cri-o://10e12106c5dfffa2cadb4bca1b1d999b0999c41bc1f59a64d6b037fd834db6ed" gracePeriod=30 Oct 01 13:26:22 crc kubenswrapper[4749]: I1001 13:26:22.521611 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:26:23 crc kubenswrapper[4749]: I1001 13:26:23.513658 4749 generic.go:334] "Generic (PLEG): container finished" podID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerID="10e12106c5dfffa2cadb4bca1b1d999b0999c41bc1f59a64d6b037fd834db6ed" exitCode=0 Oct 01 13:26:23 crc kubenswrapper[4749]: I1001 13:26:23.513943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f698d9cb-ntlhp" event={"ID":"60010c1d-09a2-4f43-9f80-c896d4d02945","Type":"ContainerDied","Data":"10e12106c5dfffa2cadb4bca1b1d999b0999c41bc1f59a64d6b037fd834db6ed"} Oct 01 13:26:24 crc kubenswrapper[4749]: I1001 13:26:24.526326 4749 generic.go:334] "Generic (PLEG): container finished" podID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerID="f8206c69be3f6ecbb17fa47a68cfad645dc99620b381ea0e0e35641265e10873" exitCode=0 Oct 01 13:26:24 crc kubenswrapper[4749]: I1001 13:26:24.526368 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f698d9cb-ntlhp" event={"ID":"60010c1d-09a2-4f43-9f80-c896d4d02945","Type":"ContainerDied","Data":"f8206c69be3f6ecbb17fa47a68cfad645dc99620b381ea0e0e35641265e10873"} Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.095173 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.153975 4749 scope.go:117] "RemoveContainer" containerID="a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.161468 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-logs\") pod \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.161519 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-combined-ca-bundle\") pod \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.161545 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-httpd-run\") pod \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.161613 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5k7k\" (UniqueName: \"kubernetes.io/projected/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-kube-api-access-h5k7k\") pod \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.161669 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " Oct 01 13:26:25 crc 
kubenswrapper[4749]: I1001 13:26:25.162021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-logs" (OuterVolumeSpecName: "logs") pod "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" (UID: "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.162312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" (UID: "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.162420 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-scripts\") pod \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.162471 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-config-data\") pod \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\" (UID: \"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6\") " Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.162828 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.162844 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-httpd-run\") on node \"crc\" 
DevicePath \"\"" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.169524 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-scripts" (OuterVolumeSpecName: "scripts") pod "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" (UID: "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.169547 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" (UID: "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.173266 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-kube-api-access-h5k7k" (OuterVolumeSpecName: "kube-api-access-h5k7k") pod "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" (UID: "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6"). InnerVolumeSpecName "kube-api-access-h5k7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.198001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" (UID: "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.239870 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-config-data" (OuterVolumeSpecName: "config-data") pod "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" (UID: "7be5a817-c26e-4fcf-8da2-d61f9b58c2a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.268434 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.268468 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5k7k\" (UniqueName: \"kubernetes.io/projected/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-kube-api-access-h5k7k\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.268502 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.268513 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.268526 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.327944 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.371259 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.379189 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.387751 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84f6c74b66-trbb9" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.395483 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ljzz5"] Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.395909 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerName="glance-httpd" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.395921 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerName="glance-httpd" Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.395939 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerName="dnsmasq-dns" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.395945 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerName="dnsmasq-dns" Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.395956 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon" Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.395962 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" 
containerName="horizon"
Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.395972 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerName="init"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.395978 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerName="init"
Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.395988 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.395994 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.396029 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerName="glance-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.396034 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerName="glance-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.396210 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.396234 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" containerName="horizon"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.396248 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerName="glance-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.396256 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" containerName="glance-httpd"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.396276 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerName="dnsmasq-dns"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.402057 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ljzz5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.419357 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ljzz5"]
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.433298 4749 scope.go:117] "RemoveContainer" containerID="27a7c8d16c36fe7ae9ff601be709ab8003dd8855197b5b558dc33591e0ebdb31"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.576649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae81866-7b3d-44f4-a4fd-5b49e2223352-logs\") pod \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-tls-certs\") pod \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580526 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-config-data\") pod \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580568 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-scripts\") pod \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-combined-ca-bundle\") pod \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-nb\") pod \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580691 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-secret-key\") pod \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580719 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-swift-storage-0\") pod \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-config\") pod \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580814 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4278f\" (UniqueName: \"kubernetes.io/projected/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-kube-api-access-4278f\") pod \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580831 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-svc\") pod \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580851 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-sb\") pod \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\" (UID: \"c79b5ea5-4a70-4868-b72a-bc8efb0cb967\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.580893 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjjtp\" (UniqueName: \"kubernetes.io/projected/7ae81866-7b3d-44f4-a4fd-5b49e2223352-kube-api-access-bjjtp\") pod \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\" (UID: \"7ae81866-7b3d-44f4-a4fd-5b49e2223352\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.581104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc752\" (UniqueName: \"kubernetes.io/projected/3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6-kube-api-access-nc752\") pod \"nova-api-db-create-ljzz5\" (UID: \"3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6\") " pod="openstack/nova-api-db-create-ljzz5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.589011 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7ae81866-7b3d-44f4-a4fd-5b49e2223352" (UID: "7ae81866-7b3d-44f4-a4fd-5b49e2223352"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.600356 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ae81866-7b3d-44f4-a4fd-5b49e2223352-logs" (OuterVolumeSpecName: "logs") pod "7ae81866-7b3d-44f4-a4fd-5b49e2223352" (UID: "7ae81866-7b3d-44f4-a4fd-5b49e2223352"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.603495 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.603512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7be5a817-c26e-4fcf-8da2-d61f9b58c2a6","Type":"ContainerDied","Data":"16ebf550cb3786752dfb951face918f56bd4a6c64811eb5e618fc4460c617536"}
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.608466 4749 scope.go:117] "RemoveContainer" containerID="8daaff65098af9b5ce22d4a46e81726de546d2e56aabe210fc184ab62a57872b"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.625796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-kube-api-access-4278f" (OuterVolumeSpecName: "kube-api-access-4278f") pod "c79b5ea5-4a70-4868-b72a-bc8efb0cb967" (UID: "c79b5ea5-4a70-4868-b72a-bc8efb0cb967"). InnerVolumeSpecName "kube-api-access-4278f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.633286 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-d8mdh"]
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.658674 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f7f865789-9mjp6"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.662667 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d8mdh"]
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.662704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" event={"ID":"c79b5ea5-4a70-4868-b72a-bc8efb0cb967","Type":"ContainerDied","Data":"83fcbcc736369e1a4f2afdc9eec350e8f2c7c2b324865725270b38bcb547c6f6"}
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.662737 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hghc5"]
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.673554 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8mdh"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.679501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hghc5"]
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.679601 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hghc5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.681796 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae81866-7b3d-44f4-a4fd-5b49e2223352-kube-api-access-bjjtp" (OuterVolumeSpecName: "kube-api-access-bjjtp") pod "7ae81866-7b3d-44f4-a4fd-5b49e2223352" (UID: "7ae81866-7b3d-44f4-a4fd-5b49e2223352"). InnerVolumeSpecName "kube-api-access-bjjtp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.683085 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc752\" (UniqueName: \"kubernetes.io/projected/3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6-kube-api-access-nc752\") pod \"nova-api-db-create-ljzz5\" (UID: \"3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6\") " pod="openstack/nova-api-db-create-ljzz5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.683234 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.683249 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4278f\" (UniqueName: \"kubernetes.io/projected/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-kube-api-access-4278f\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.683259 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjjtp\" (UniqueName: \"kubernetes.io/projected/7ae81866-7b3d-44f4-a4fd-5b49e2223352-kube-api-access-bjjtp\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.683269 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ae81866-7b3d-44f4-a4fd-5b49e2223352-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.690621 4749 scope.go:117] "RemoveContainer" containerID="8a542d090d53c6a6a4483dcf44c10f45e9bcc3777789339ef819c7139b20d0ee"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.704072 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84f6c74b66-trbb9" event={"ID":"7ae81866-7b3d-44f4-a4fd-5b49e2223352","Type":"ContainerDied","Data":"1bc40e0de8b2d1ef8f2d2792a39a5ca4e4e09a2a1e2054a2fcd8c584b0d11f60"}
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.704157 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84f6c74b66-trbb9"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.735143 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc752\" (UniqueName: \"kubernetes.io/projected/3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6-kube-api-access-nc752\") pod \"nova-api-db-create-ljzz5\" (UID: \"3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6\") " pod="openstack/nova-api-db-create-ljzz5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.755987 4749 scope.go:117] "RemoveContainer" containerID="a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7"
Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.762076 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7\": container with ID starting with a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7 not found: ID does not exist" containerID="a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.762120 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7"} err="failed to get container status \"a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7\": rpc error: code = NotFound desc = could not find container \"a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7\": container with ID starting with a564fcc1ebd30a1375118d964c9bc6f68d498cb7bb16068d85b6bc4cb87326e7 not found: ID does not exist"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.762151 4749 scope.go:117] "RemoveContainer" containerID="471780b2bdc8745aebe7cfe840d05d191acd2d01fec415465f610d75fa940296"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.768395 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.783543 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ljzz5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.784669 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.787681 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-422r9\" (UniqueName: \"kubernetes.io/projected/9b1f0790-fd61-4075-b031-cd82fa151ab8-kube-api-access-422r9\") pod \"nova-cell1-db-create-hghc5\" (UID: \"9b1f0790-fd61-4075-b031-cd82fa151ab8\") " pod="openstack/nova-cell1-db-create-hghc5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.787815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wffn\" (UniqueName: \"kubernetes.io/projected/0cf66c1c-176c-4671-9887-07295eb47200-kube-api-access-6wffn\") pod \"nova-cell0-db-create-d8mdh\" (UID: \"0cf66c1c-176c-4671-9887-07295eb47200\") " pod="openstack/nova-cell0-db-create-d8mdh"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.790513 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c48c76d4d-blnpd"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.812802 4749 scope.go:117] "RemoveContainer" containerID="5c7117c92e176d55285b89f0ef32a676d9ac4084abb07d48c664a5ab6583f838"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.825430 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.825857 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.825868 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api"
Oct 01 13:26:25 crc kubenswrapper[4749]: E1001 13:26:25.825888 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.825894 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.826063 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api-log"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.826075 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.827024 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.836708 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.838014 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.844382 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.889164 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wffn\" (UniqueName: \"kubernetes.io/projected/0cf66c1c-176c-4671-9887-07295eb47200-kube-api-access-6wffn\") pod \"nova-cell0-db-create-d8mdh\" (UID: \"0cf66c1c-176c-4671-9887-07295eb47200\") " pod="openstack/nova-cell0-db-create-d8mdh"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.889589 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-422r9\" (UniqueName: \"kubernetes.io/projected/9b1f0790-fd61-4075-b031-cd82fa151ab8-kube-api-access-422r9\") pod \"nova-cell1-db-create-hghc5\" (UID: \"9b1f0790-fd61-4075-b031-cd82fa151ab8\") " pod="openstack/nova-cell1-db-create-hghc5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.916347 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-422r9\" (UniqueName: \"kubernetes.io/projected/9b1f0790-fd61-4075-b031-cd82fa151ab8-kube-api-access-422r9\") pod \"nova-cell1-db-create-hghc5\" (UID: \"9b1f0790-fd61-4075-b031-cd82fa151ab8\") " pod="openstack/nova-cell1-db-create-hghc5"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.922967 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wffn\" (UniqueName: \"kubernetes.io/projected/0cf66c1c-176c-4671-9887-07295eb47200-kube-api-access-6wffn\") pod \"nova-cell0-db-create-d8mdh\" (UID: \"0cf66c1c-176c-4671-9887-07295eb47200\") " pod="openstack/nova-cell0-db-create-d8mdh"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.957267 4749 scope.go:117] "RemoveContainer" containerID="d2bfb15f46d0e5919d2d8e5d14975a28845e64239f7e1258b651e08a349a7316"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.976513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-config-data" (OuterVolumeSpecName: "config-data") pod "7ae81866-7b3d-44f4-a4fd-5b49e2223352" (UID: "7ae81866-7b3d-44f4-a4fd-5b49e2223352"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.994885 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjwxh\" (UniqueName: \"kubernetes.io/projected/d658fa13-ec8c-4936-9015-531912d2e050-kube-api-access-tjwxh\") pod \"d658fa13-ec8c-4936-9015-531912d2e050\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995070 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data-custom\") pod \"d658fa13-ec8c-4936-9015-531912d2e050\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995246 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d658fa13-ec8c-4936-9015-531912d2e050-logs\") pod \"d658fa13-ec8c-4936-9015-531912d2e050\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data\") pod \"d658fa13-ec8c-4936-9015-531912d2e050\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-combined-ca-bundle\") pod \"d658fa13-ec8c-4936-9015-531912d2e050\" (UID: \"d658fa13-ec8c-4936-9015-531912d2e050\") "
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-logs\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995785 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995782 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d658fa13-ec8c-4936-9015-531912d2e050-logs" (OuterVolumeSpecName: "logs") pod "d658fa13-ec8c-4936-9015-531912d2e050" (UID: "d658fa13-ec8c-4936-9015-531912d2e050"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995951 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.995976 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.996001 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.996060 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.996105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc2vx\" (UniqueName: \"kubernetes.io/projected/81ccc37c-c732-4d54-b526-b70f51a5af62-kube-api-access-rc2vx\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.996331 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:25 crc kubenswrapper[4749]: I1001 13:26:25.996343 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d658fa13-ec8c-4936-9015-531912d2e050-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.011661 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8mdh"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.014096 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d658fa13-ec8c-4936-9015-531912d2e050-kube-api-access-tjwxh" (OuterVolumeSpecName: "kube-api-access-tjwxh") pod "d658fa13-ec8c-4936-9015-531912d2e050" (UID: "d658fa13-ec8c-4936-9015-531912d2e050"). InnerVolumeSpecName "kube-api-access-tjwxh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.020570 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d658fa13-ec8c-4936-9015-531912d2e050" (UID: "d658fa13-ec8c-4936-9015-531912d2e050"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.029937 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hghc5"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.080393 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.098877 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.098963 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-logs\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.099015 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.099089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.099116 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.099144 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.099186 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.099248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc2vx\" (UniqueName: \"kubernetes.io/projected/81ccc37c-c732-4d54-b526-b70f51a5af62-kube-api-access-rc2vx\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.099373 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjwxh\" (UniqueName: \"kubernetes.io/projected/d658fa13-ec8c-4936-9015-531912d2e050-kube-api-access-tjwxh\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.099389 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.100782 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.101567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c79b5ea5-4a70-4868-b72a-bc8efb0cb967" (UID: "c79b5ea5-4a70-4868-b72a-bc8efb0cb967"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.102388 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.102394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-logs\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.113876 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d658fa13-ec8c-4936-9015-531912d2e050" (UID: "d658fa13-ec8c-4936-9015-531912d2e050"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.115541 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-config-data\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.123979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.132448 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-scripts\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.138257 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-scripts" (OuterVolumeSpecName: "scripts") pod "7ae81866-7b3d-44f4-a4fd-5b49e2223352" (UID: "7ae81866-7b3d-44f4-a4fd-5b49e2223352"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.138499 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.146833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc2vx\" (UniqueName: \"kubernetes.io/projected/81ccc37c-c732-4d54-b526-b70f51a5af62-kube-api-access-rc2vx\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.183372 4749 scope.go:117] "RemoveContainer" containerID="ad615b39edf900a955c90d5cbdd1a5684726124559af5f04e44d0233af379b78"
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.195057 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ae81866-7b3d-44f4-a4fd-5b49e2223352" (UID: "7ae81866-7b3d-44f4-a4fd-5b49e2223352"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.204898 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7ae81866-7b3d-44f4-a4fd-5b49e2223352-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.204934 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.204946 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.204957 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.296484 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c79b5ea5-4a70-4868-b72a-bc8efb0cb967" (UID: "c79b5ea5-4a70-4868-b72a-bc8efb0cb967"). InnerVolumeSpecName "dns-swift-storage-0".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.305962 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.315857 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-85b88cb8b7-2gzx9"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.341783 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c79b5ea5-4a70-4868-b72a-bc8efb0cb967" (UID: "c79b5ea5-4a70-4868-b72a-bc8efb0cb967"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.352686 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "7ae81866-7b3d-44f4-a4fd-5b49e2223352" (UID: "7ae81866-7b3d-44f4-a4fd-5b49e2223352"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.379674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c79b5ea5-4a70-4868-b72a-bc8efb0cb967" (UID: "c79b5ea5-4a70-4868-b72a-bc8efb0cb967"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.390678 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.409676 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.411151 4749 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae81866-7b3d-44f4-a4fd-5b49e2223352-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.411169 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.411180 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.421955 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-config" (OuterVolumeSpecName: "config") pod "c79b5ea5-4a70-4868-b72a-bc8efb0cb967" (UID: "c79b5ea5-4a70-4868-b72a-bc8efb0cb967"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.436909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data" (OuterVolumeSpecName: "config-data") pod "d658fa13-ec8c-4936-9015-531912d2e050" (UID: "d658fa13-ec8c-4936-9015-531912d2e050"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.455506 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.518345 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d658fa13-ec8c-4936-9015-531912d2e050-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.518380 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c79b5ea5-4a70-4868-b72a-bc8efb0cb967-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.564270 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.694372 4749 scope.go:117] "RemoveContainer" containerID="a94805c30f134bb40119ae76ad5949c8770747149e8665ebf0949b027b1ecb2a" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.741362 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x542\" (UniqueName: \"kubernetes.io/projected/60010c1d-09a2-4f43-9f80-c896d4d02945-kube-api-access-6x542\") pod \"60010c1d-09a2-4f43-9f80-c896d4d02945\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.741421 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-ovndb-tls-certs\") pod \"60010c1d-09a2-4f43-9f80-c896d4d02945\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.741460 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-combined-ca-bundle\") pod \"60010c1d-09a2-4f43-9f80-c896d4d02945\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.741512 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-httpd-config\") pod \"60010c1d-09a2-4f43-9f80-c896d4d02945\" (UID: \"60010c1d-09a2-4f43-9f80-c896d4d02945\") " Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.741553 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-config\") pod \"60010c1d-09a2-4f43-9f80-c896d4d02945\" (UID: 
\"60010c1d-09a2-4f43-9f80-c896d4d02945\") " Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.775669 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f7f865789-9mjp6"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.788142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6dfb5585-x78z5" event={"ID":"c8b1d3a9-044c-475f-b86f-7e099e2b1197","Type":"ContainerStarted","Data":"0f3ec247cbb8a598385eff39c85b6356b592a10ede2baab61f7199d92c8432d5"} Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.790603 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f7f865789-9mjp6"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.793061 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32fe2369-6499-4f14-8c75-33933c8bb608","Type":"ContainerStarted","Data":"8cf7afb6f23b9d89f42e17ee55ee06d743e43b8d163f81ab9529b09781afd762"} Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.796007 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f698d9cb-ntlhp" event={"ID":"60010c1d-09a2-4f43-9f80-c896d4d02945","Type":"ContainerDied","Data":"783402d68dd0b75d4435548f60b599d91355791d9e97c8a473e1328a2eab77c1"} Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.796101 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58f698d9cb-ntlhp" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.801289 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" event={"ID":"6139ffc4-c70f-45d5-aa79-6fc7b79f2034","Type":"ContainerStarted","Data":"1526e9ae7004ed44b7b57096855dafcccaaa715121e976f25a7c216ab3f5dcb2"} Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.804930 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f0334e5-add1-4ced-bad4-7e77d528e28a","Type":"ContainerStarted","Data":"c42fd3d738393028009cc08e6486c477c519e8a64222f63be983b32c5d0c9266"} Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.810234 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c48c76d4d-blnpd" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.811101 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "60010c1d-09a2-4f43-9f80-c896d4d02945" (UID: "60010c1d-09a2-4f43-9f80-c896d4d02945"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.819200 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c48c76d4d-blnpd" event={"ID":"d658fa13-ec8c-4936-9015-531912d2e050","Type":"ContainerDied","Data":"fd3df1bfa21272382b244cc320e9f6e922ffe1c1e42e8be6f586487e918cdac8"} Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.824919 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60010c1d-09a2-4f43-9f80-c896d4d02945-kube-api-access-6x542" (OuterVolumeSpecName: "kube-api-access-6x542") pod "60010c1d-09a2-4f43-9f80-c896d4d02945" (UID: "60010c1d-09a2-4f43-9f80-c896d4d02945"). 
InnerVolumeSpecName "kube-api-access-6x542". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.839430 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84f6c74b66-trbb9"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.843927 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x542\" (UniqueName: \"kubernetes.io/projected/60010c1d-09a2-4f43-9f80-c896d4d02945-kube-api-access-6x542\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.843953 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.868041 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84f6c74b66-trbb9"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.888161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ljzz5"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.897564 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c48c76d4d-blnpd"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.908460 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c48c76d4d-blnpd"] Oct 01 13:26:26 crc kubenswrapper[4749]: I1001 13:26:26.992578 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hghc5"] Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.018394 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-config" (OuterVolumeSpecName: "config") pod "60010c1d-09a2-4f43-9f80-c896d4d02945" (UID: "60010c1d-09a2-4f43-9f80-c896d4d02945"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.037725 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60010c1d-09a2-4f43-9f80-c896d4d02945" (UID: "60010c1d-09a2-4f43-9f80-c896d4d02945"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.043145 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "60010c1d-09a2-4f43-9f80-c896d4d02945" (UID: "60010c1d-09a2-4f43-9f80-c896d4d02945"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.048012 4749 scope.go:117] "RemoveContainer" containerID="070469cc902a7a3114f0e257a6257c4a6d561f9bd6e5bdf6b9ee86f6304d2e5f" Oct 01 13:26:27 crc kubenswrapper[4749]: W1001 13:26:27.056671 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ab26a09_3ca5_4568_b097_b6c4cfa6c8b6.slice/crio-8e477d0b26298920d0322e045e7a4a3b35486542f299e32e15378f93952f6db0 WatchSource:0}: Error finding container 8e477d0b26298920d0322e045e7a4a3b35486542f299e32e15378f93952f6db0: Status 404 returned error can't find the container with id 8e477d0b26298920d0322e045e7a4a3b35486542f299e32e15378f93952f6db0 Oct 01 13:26:27 crc kubenswrapper[4749]: W1001 13:26:27.061477 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b1f0790_fd61_4075_b031_cd82fa151ab8.slice/crio-a239cc7ff1e95d3d1f21534711a1fa50cce3830eb5fa8df21264740ad32e8d15 WatchSource:0}: Error 
finding container a239cc7ff1e95d3d1f21534711a1fa50cce3830eb5fa8df21264740ad32e8d15: Status 404 returned error can't find the container with id a239cc7ff1e95d3d1f21534711a1fa50cce3830eb5fa8df21264740ad32e8d15 Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.067599 4749 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.067626 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.067641 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/60010c1d-09a2-4f43-9f80-c896d4d02945-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.220551 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-d8mdh"] Oct 01 13:26:27 crc kubenswrapper[4749]: W1001 13:26:27.227554 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cf66c1c_176c_4671_9887_07295eb47200.slice/crio-bc00e2bc138182ff885f4c0728824eb36e524caba22086a9506247d08c91c511 WatchSource:0}: Error finding container bc00e2bc138182ff885f4c0728824eb36e524caba22086a9506247d08c91c511: Status 404 returned error can't find the container with id bc00e2bc138182ff885f4c0728824eb36e524caba22086a9506247d08c91c511 Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.264171 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae81866-7b3d-44f4-a4fd-5b49e2223352" path="/var/lib/kubelet/pods/7ae81866-7b3d-44f4-a4fd-5b49e2223352/volumes" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 
13:26:27.264973 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7be5a817-c26e-4fcf-8da2-d61f9b58c2a6" path="/var/lib/kubelet/pods/7be5a817-c26e-4fcf-8da2-d61f9b58c2a6/volumes" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.265854 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" path="/var/lib/kubelet/pods/c79b5ea5-4a70-4868-b72a-bc8efb0cb967/volumes" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.269497 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d658fa13-ec8c-4936-9015-531912d2e050" path="/var/lib/kubelet/pods/d658fa13-ec8c-4936-9015-531912d2e050/volumes" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.295469 4749 scope.go:117] "RemoveContainer" containerID="10e12106c5dfffa2cadb4bca1b1d999b0999c41bc1f59a64d6b037fd834db6ed" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.316567 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-58f698d9cb-ntlhp"] Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.335079 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-58f698d9cb-ntlhp"] Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.339880 4749 scope.go:117] "RemoveContainer" containerID="f8206c69be3f6ecbb17fa47a68cfad645dc99620b381ea0e0e35641265e10873" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.414419 4749 scope.go:117] "RemoveContainer" containerID="21c48d30d51810699355b4aafbf313ede9fab00195b8e4464bf27bbe0323f82d" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.429783 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:26:27 crc kubenswrapper[4749]: W1001 13:26:27.443399 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ccc37c_c732_4d54_b526_b70f51a5af62.slice/crio-ff21d934b09191f5abcb85e73e74cf07538e1519004910a48c049070efd1a6b0 WatchSource:0}: Error finding container ff21d934b09191f5abcb85e73e74cf07538e1519004910a48c049070efd1a6b0: Status 404 returned error can't find the container with id ff21d934b09191f5abcb85e73e74cf07538e1519004910a48c049070efd1a6b0 Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.457241 4749 scope.go:117] "RemoveContainer" containerID="3b4f2110a7200bbd3dea79b60ae19d03800b62cc9162c4b42d603ece72c13f52" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.818746 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-f7f865789-9mjp6" podUID="c79b5ea5-4a70-4868-b72a-bc8efb0cb967" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: i/o timeout" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.894808 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" event={"ID":"6139ffc4-c70f-45d5-aa79-6fc7b79f2034","Type":"ContainerStarted","Data":"b4dae76a935addcb9b9f8a5f0e8de8e81a81d2bca965d2b5b0b7bd6f3ed9b181"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.898164 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b55ebe69-1518-428b-9ceb-383de60316cc","Type":"ContainerStarted","Data":"925a15059a85076c5763a201e7a2b7f30a158109032768bfb03996ac57810856"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.902132 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" event={"ID":"a27e333f-57a3-4257-9e49-e03928cfa02d","Type":"ContainerStarted","Data":"8581c4d8a78a26f55da196cb25d373c03c51e270ddc83e15e91b7fb06b7593ed"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.902157 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" event={"ID":"a27e333f-57a3-4257-9e49-e03928cfa02d","Type":"ContainerStarted","Data":"b08e7382af5f5f9775414ef91dc95f216e704a070be1c2d7c338c8e814e8b386"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.904028 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5c6dfb5585-x78z5" event={"ID":"c8b1d3a9-044c-475f-b86f-7e099e2b1197","Type":"ContainerStarted","Data":"650b57a9024d94ca39a91dd70e1becdd8318df905340ef6b03f5fbe5b1ed7cfc"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.905636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81ccc37c-c732-4d54-b526-b70f51a5af62","Type":"ContainerStarted","Data":"ff21d934b09191f5abcb85e73e74cf07538e1519004910a48c049070efd1a6b0"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.907959 4749 generic.go:334] "Generic (PLEG): container finished" podID="9b1f0790-fd61-4075-b031-cd82fa151ab8" containerID="b61e5a52990785704207c03519092fd1407992b2785b3eb0df19e683d390801f" exitCode=0 Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.908024 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hghc5" event={"ID":"9b1f0790-fd61-4075-b031-cd82fa151ab8","Type":"ContainerDied","Data":"b61e5a52990785704207c03519092fd1407992b2785b3eb0df19e683d390801f"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.908040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hghc5" event={"ID":"9b1f0790-fd61-4075-b031-cd82fa151ab8","Type":"ContainerStarted","Data":"a239cc7ff1e95d3d1f21534711a1fa50cce3830eb5fa8df21264740ad32e8d15"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.909737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"3f0334e5-add1-4ced-bad4-7e77d528e28a","Type":"ContainerStarted","Data":"3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.923284 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tc29m" event={"ID":"d9ead71f-c58e-4634-96f9-81c9b165e24c","Type":"ContainerStarted","Data":"ec82d5bfc32297ac60d77fde66e8e92e7ae2c1287ead785bfa8106988027fb00"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.929031 4749 generic.go:334] "Generic (PLEG): container finished" podID="0cf66c1c-176c-4671-9887-07295eb47200" containerID="3057dffa0bfb34dc4a25d3f4a9238802eb20368a4a8820bd259e900722093104" exitCode=0 Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.929102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8mdh" event={"ID":"0cf66c1c-176c-4671-9887-07295eb47200","Type":"ContainerDied","Data":"3057dffa0bfb34dc4a25d3f4a9238802eb20368a4a8820bd259e900722093104"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.929125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8mdh" event={"ID":"0cf66c1c-176c-4671-9887-07295eb47200","Type":"ContainerStarted","Data":"bc00e2bc138182ff885f4c0728824eb36e524caba22086a9506247d08c91c511"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.939415 4749 generic.go:334] "Generic (PLEG): container finished" podID="3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6" containerID="772eb734a26324a9b5d89a79a2165220c78dcb2ac21997af68fd098ccdc9f490" exitCode=0 Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.939462 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ljzz5" event={"ID":"3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6","Type":"ContainerDied","Data":"772eb734a26324a9b5d89a79a2165220c78dcb2ac21997af68fd098ccdc9f490"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.939489 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-db-create-ljzz5" event={"ID":"3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6","Type":"ContainerStarted","Data":"8e477d0b26298920d0322e045e7a4a3b35486542f299e32e15378f93952f6db0"} Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.941733 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.622489654 podStartE2EDuration="23.941705251s" podCreationTimestamp="2025-10-01 13:26:04 +0000 UTC" firstStartedPulling="2025-10-01 13:26:06.11504501 +0000 UTC m=+1226.169029909" lastFinishedPulling="2025-10-01 13:26:25.434260607 +0000 UTC m=+1245.488245506" observedRunningTime="2025-10-01 13:26:27.919214085 +0000 UTC m=+1247.973198984" watchObservedRunningTime="2025-10-01 13:26:27.941705251 +0000 UTC m=+1247.995690150" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.942367 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5c6dfb5585-x78z5" podStartSLOduration=14.545624128 podStartE2EDuration="33.942356431s" podCreationTimestamp="2025-10-01 13:25:54 +0000 UTC" firstStartedPulling="2025-10-01 13:26:05.80686403 +0000 UTC m=+1225.860848929" lastFinishedPulling="2025-10-01 13:26:25.203596333 +0000 UTC m=+1245.257581232" observedRunningTime="2025-10-01 13:26:27.933443417 +0000 UTC m=+1247.987428316" watchObservedRunningTime="2025-10-01 13:26:27.942356431 +0000 UTC m=+1247.996341350" Oct 01 13:26:27 crc kubenswrapper[4749]: I1001 13:26:27.962905 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=8.962886099 podStartE2EDuration="8.962886099s" podCreationTimestamp="2025-10-01 13:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:27.957419287 +0000 UTC m=+1248.011404186" watchObservedRunningTime="2025-10-01 13:26:27.962886099 
+0000 UTC m=+1248.016870998" Oct 01 13:26:28 crc kubenswrapper[4749]: I1001 13:26:28.004164 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5fdf7b5778-vxx8p" podStartSLOduration=14.583362955 podStartE2EDuration="34.004144631s" podCreationTimestamp="2025-10-01 13:25:54 +0000 UTC" firstStartedPulling="2025-10-01 13:26:05.831484959 +0000 UTC m=+1225.885469868" lastFinishedPulling="2025-10-01 13:26:25.252266645 +0000 UTC m=+1245.306251544" observedRunningTime="2025-10-01 13:26:27.998595527 +0000 UTC m=+1248.052580426" watchObservedRunningTime="2025-10-01 13:26:28.004144631 +0000 UTC m=+1248.058129530" Oct 01 13:26:28 crc kubenswrapper[4749]: I1001 13:26:28.053823 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-tc29m" podStartSLOduration=4.551603672 podStartE2EDuration="1m1.053806972s" podCreationTimestamp="2025-10-01 13:25:27 +0000 UTC" firstStartedPulling="2025-10-01 13:25:28.825809299 +0000 UTC m=+1188.879794198" lastFinishedPulling="2025-10-01 13:26:25.328012599 +0000 UTC m=+1245.381997498" observedRunningTime="2025-10-01 13:26:28.050397541 +0000 UTC m=+1248.104382440" watchObservedRunningTime="2025-10-01 13:26:28.053806972 +0000 UTC m=+1248.107791871" Oct 01 13:26:28 crc kubenswrapper[4749]: I1001 13:26:28.409208 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:28 crc kubenswrapper[4749]: I1001 13:26:28.962107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81ccc37c-c732-4d54-b526-b70f51a5af62","Type":"ContainerStarted","Data":"8c6f2c8133bb9fa384ce08f9a8d3dc16a3bd5486d4520089b8ff355faf77d338"} Oct 01 13:26:28 crc kubenswrapper[4749]: I1001 13:26:28.965187 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" 
event={"ID":"6139ffc4-c70f-45d5-aa79-6fc7b79f2034","Type":"ContainerStarted","Data":"ca276d7d5fa81537ea30bc81a2767cddaf9baa8b4c3f9ed9e71d8fa1e8fa93db"} Oct 01 13:26:28 crc kubenswrapper[4749]: I1001 13:26:28.966491 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:28 crc kubenswrapper[4749]: I1001 13:26:28.966529 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:28 crc kubenswrapper[4749]: I1001 13:26:28.970574 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32fe2369-6499-4f14-8c75-33933c8bb608","Type":"ContainerStarted","Data":"298d7af4b919df3b1a71513ec6a0a642bd07a773a298a78749c0b2597d1bce01"} Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.247517 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" path="/var/lib/kubelet/pods/60010c1d-09a2-4f43-9f80-c896d4d02945/volumes" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.553428 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ljzz5" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.573721 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" podStartSLOduration=16.57370003 podStartE2EDuration="16.57370003s" podCreationTimestamp="2025-10-01 13:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:28.997770688 +0000 UTC m=+1249.051755597" watchObservedRunningTime="2025-10-01 13:26:29.57370003 +0000 UTC m=+1249.627684919" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.622822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc752\" (UniqueName: \"kubernetes.io/projected/3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6-kube-api-access-nc752\") pod \"3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6\" (UID: \"3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6\") " Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.629604 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6-kube-api-access-nc752" (OuterVolumeSpecName: "kube-api-access-nc752") pod "3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6" (UID: "3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6"). InnerVolumeSpecName "kube-api-access-nc752". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.689548 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hghc5" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.693534 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-d8mdh" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.725600 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-422r9\" (UniqueName: \"kubernetes.io/projected/9b1f0790-fd61-4075-b031-cd82fa151ab8-kube-api-access-422r9\") pod \"9b1f0790-fd61-4075-b031-cd82fa151ab8\" (UID: \"9b1f0790-fd61-4075-b031-cd82fa151ab8\") " Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.725775 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wffn\" (UniqueName: \"kubernetes.io/projected/0cf66c1c-176c-4671-9887-07295eb47200-kube-api-access-6wffn\") pod \"0cf66c1c-176c-4671-9887-07295eb47200\" (UID: \"0cf66c1c-176c-4671-9887-07295eb47200\") " Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.726338 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc752\" (UniqueName: \"kubernetes.io/projected/3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6-kube-api-access-nc752\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.731322 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cf66c1c-176c-4671-9887-07295eb47200-kube-api-access-6wffn" (OuterVolumeSpecName: "kube-api-access-6wffn") pod "0cf66c1c-176c-4671-9887-07295eb47200" (UID: "0cf66c1c-176c-4671-9887-07295eb47200"). InnerVolumeSpecName "kube-api-access-6wffn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.731504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1f0790-fd61-4075-b031-cd82fa151ab8-kube-api-access-422r9" (OuterVolumeSpecName: "kube-api-access-422r9") pod "9b1f0790-fd61-4075-b031-cd82fa151ab8" (UID: "9b1f0790-fd61-4075-b031-cd82fa151ab8"). InnerVolumeSpecName "kube-api-access-422r9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.828085 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-422r9\" (UniqueName: \"kubernetes.io/projected/9b1f0790-fd61-4075-b031-cd82fa151ab8-kube-api-access-422r9\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.828317 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wffn\" (UniqueName: \"kubernetes.io/projected/0cf66c1c-176c-4671-9887-07295eb47200-kube-api-access-6wffn\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.972931 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:29 crc kubenswrapper[4749]: I1001 13:26:29.979749 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.004655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32fe2369-6499-4f14-8c75-33933c8bb608","Type":"ContainerStarted","Data":"4f834acbac422d949bb014d80000661055eae59e0ec3411b3a78201e0a97adad"} Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.004717 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="32fe2369-6499-4f14-8c75-33933c8bb608" containerName="glance-log" containerID="cri-o://298d7af4b919df3b1a71513ec6a0a642bd07a773a298a78749c0b2597d1bce01" gracePeriod=30 Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.004789 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="32fe2369-6499-4f14-8c75-33933c8bb608" containerName="glance-httpd" containerID="cri-o://4f834acbac422d949bb014d80000661055eae59e0ec3411b3a78201e0a97adad" 
gracePeriod=30 Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.007614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81ccc37c-c732-4d54-b526-b70f51a5af62","Type":"ContainerStarted","Data":"fd1307259f58642a9b1989cf41778ef38302632636318edc11bb895abe20bf6b"} Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.010922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hghc5" event={"ID":"9b1f0790-fd61-4075-b031-cd82fa151ab8","Type":"ContainerDied","Data":"a239cc7ff1e95d3d1f21534711a1fa50cce3830eb5fa8df21264740ad32e8d15"} Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.010969 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a239cc7ff1e95d3d1f21534711a1fa50cce3830eb5fa8df21264740ad32e8d15" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.011043 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hghc5" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.015762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-d8mdh" event={"ID":"0cf66c1c-176c-4671-9887-07295eb47200","Type":"ContainerDied","Data":"bc00e2bc138182ff885f4c0728824eb36e524caba22086a9506247d08c91c511"} Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.015806 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc00e2bc138182ff885f4c0728824eb36e524caba22086a9506247d08c91c511" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.015884 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-d8mdh" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.024858 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ljzz5" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.028112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ljzz5" event={"ID":"3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6","Type":"ContainerDied","Data":"8e477d0b26298920d0322e045e7a4a3b35486542f299e32e15378f93952f6db0"} Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.028146 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e477d0b26298920d0322e045e7a4a3b35486542f299e32e15378f93952f6db0" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.038381 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.038361406 podStartE2EDuration="11.038361406s" podCreationTimestamp="2025-10-01 13:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:30.034536043 +0000 UTC m=+1250.088520952" watchObservedRunningTime="2025-10-01 13:26:30.038361406 +0000 UTC m=+1250.092346305" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.046230 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.069003 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.068985733 podStartE2EDuration="5.068985733s" podCreationTimestamp="2025-10-01 13:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:30.061882223 +0000 UTC m=+1250.115867122" watchObservedRunningTime="2025-10-01 13:26:30.068985733 +0000 UTC m=+1250.122970632" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.468654 4749 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:26:30 crc kubenswrapper[4749]: I1001 13:26:30.469318 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c48c76d4d-blnpd" podUID="d658fa13-ec8c-4936-9015-531912d2e050" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": dial tcp 10.217.0.178:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.045704 4749 generic.go:334] "Generic (PLEG): container finished" podID="32fe2369-6499-4f14-8c75-33933c8bb608" containerID="4f834acbac422d949bb014d80000661055eae59e0ec3411b3a78201e0a97adad" exitCode=143 Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.046048 4749 generic.go:334] "Generic (PLEG): container finished" podID="32fe2369-6499-4f14-8c75-33933c8bb608" containerID="298d7af4b919df3b1a71513ec6a0a642bd07a773a298a78749c0b2597d1bce01" exitCode=143 Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.046205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32fe2369-6499-4f14-8c75-33933c8bb608","Type":"ContainerDied","Data":"4f834acbac422d949bb014d80000661055eae59e0ec3411b3a78201e0a97adad"} Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.046297 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32fe2369-6499-4f14-8c75-33933c8bb608","Type":"ContainerDied","Data":"298d7af4b919df3b1a71513ec6a0a642bd07a773a298a78749c0b2597d1bce01"} Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.046335 4749 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerName="glance-log" containerID="cri-o://8c6f2c8133bb9fa384ce08f9a8d3dc16a3bd5486d4520089b8ff355faf77d338" gracePeriod=30 Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.046483 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerName="glance-httpd" containerID="cri-o://fd1307259f58642a9b1989cf41778ef38302632636318edc11bb895abe20bf6b" gracePeriod=30 Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.048088 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:31 crc kubenswrapper[4749]: E1001 13:26:31.050825 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001 is running failed: container process not found" containerID="3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 01 13:26:31 crc kubenswrapper[4749]: E1001 13:26:31.053046 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001 is running failed: container process not found" containerID="3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 01 13:26:31 crc kubenswrapper[4749]: E1001 13:26:31.065270 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001 is 
running failed: container process not found" containerID="3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"] Oct 01 13:26:31 crc kubenswrapper[4749]: E1001 13:26:31.065322 4749 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001 is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="3f0334e5-add1-4ced-bad4-7e77d528e28a" containerName="watcher-decision-engine" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.467295 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.567479 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-public-tls-certs\") pod \"32fe2369-6499-4f14-8c75-33933c8bb608\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.567580 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-combined-ca-bundle\") pod \"32fe2369-6499-4f14-8c75-33933c8bb608\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.567601 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-scripts\") pod \"32fe2369-6499-4f14-8c75-33933c8bb608\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.567633 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-logs\") pod \"32fe2369-6499-4f14-8c75-33933c8bb608\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.567661 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"32fe2369-6499-4f14-8c75-33933c8bb608\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.567677 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-config-data\") pod \"32fe2369-6499-4f14-8c75-33933c8bb608\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.567760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-httpd-run\") pod \"32fe2369-6499-4f14-8c75-33933c8bb608\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.567794 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcz9r\" (UniqueName: \"kubernetes.io/projected/32fe2369-6499-4f14-8c75-33933c8bb608-kube-api-access-hcz9r\") pod \"32fe2369-6499-4f14-8c75-33933c8bb608\" (UID: \"32fe2369-6499-4f14-8c75-33933c8bb608\") " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.568195 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-logs" (OuterVolumeSpecName: "logs") pod "32fe2369-6499-4f14-8c75-33933c8bb608" (UID: "32fe2369-6499-4f14-8c75-33933c8bb608"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.568182 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "32fe2369-6499-4f14-8c75-33933c8bb608" (UID: "32fe2369-6499-4f14-8c75-33933c8bb608"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.572535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "32fe2369-6499-4f14-8c75-33933c8bb608" (UID: "32fe2369-6499-4f14-8c75-33933c8bb608"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.575337 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-scripts" (OuterVolumeSpecName: "scripts") pod "32fe2369-6499-4f14-8c75-33933c8bb608" (UID: "32fe2369-6499-4f14-8c75-33933c8bb608"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.591474 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fe2369-6499-4f14-8c75-33933c8bb608-kube-api-access-hcz9r" (OuterVolumeSpecName: "kube-api-access-hcz9r") pod "32fe2369-6499-4f14-8c75-33933c8bb608" (UID: "32fe2369-6499-4f14-8c75-33933c8bb608"). InnerVolumeSpecName "kube-api-access-hcz9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.607873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32fe2369-6499-4f14-8c75-33933c8bb608" (UID: "32fe2369-6499-4f14-8c75-33933c8bb608"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.631326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "32fe2369-6499-4f14-8c75-33933c8bb608" (UID: "32fe2369-6499-4f14-8c75-33933c8bb608"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.637260 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-config-data" (OuterVolumeSpecName: "config-data") pod "32fe2369-6499-4f14-8c75-33933c8bb608" (UID: "32fe2369-6499-4f14-8c75-33933c8bb608"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.669414 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.669445 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcz9r\" (UniqueName: \"kubernetes.io/projected/32fe2369-6499-4f14-8c75-33933c8bb608-kube-api-access-hcz9r\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.669457 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.669467 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.669475 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.669483 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fe2369-6499-4f14-8c75-33933c8bb608-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.669514 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.669524 4749 reconciler_common.go:293] "Volume detached 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32fe2369-6499-4f14-8c75-33933c8bb608-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.698573 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 01 13:26:31 crc kubenswrapper[4749]: I1001 13:26:31.771815 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.064376 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32fe2369-6499-4f14-8c75-33933c8bb608","Type":"ContainerDied","Data":"8cf7afb6f23b9d89f42e17ee55ee06d743e43b8d163f81ab9529b09781afd762"} Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.064419 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.064436 4749 scope.go:117] "RemoveContainer" containerID="4f834acbac422d949bb014d80000661055eae59e0ec3411b3a78201e0a97adad" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.067150 4749 generic.go:334] "Generic (PLEG): container finished" podID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerID="fd1307259f58642a9b1989cf41778ef38302632636318edc11bb895abe20bf6b" exitCode=0 Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.067181 4749 generic.go:334] "Generic (PLEG): container finished" podID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerID="8c6f2c8133bb9fa384ce08f9a8d3dc16a3bd5486d4520089b8ff355faf77d338" exitCode=143 Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.067278 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81ccc37c-c732-4d54-b526-b70f51a5af62","Type":"ContainerDied","Data":"fd1307259f58642a9b1989cf41778ef38302632636318edc11bb895abe20bf6b"} Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.067310 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81ccc37c-c732-4d54-b526-b70f51a5af62","Type":"ContainerDied","Data":"8c6f2c8133bb9fa384ce08f9a8d3dc16a3bd5486d4520089b8ff355faf77d338"} Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.071374 4749 generic.go:334] "Generic (PLEG): container finished" podID="3f0334e5-add1-4ced-bad4-7e77d528e28a" containerID="3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001" exitCode=1 Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.071401 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f0334e5-add1-4ced-bad4-7e77d528e28a","Type":"ContainerDied","Data":"3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001"} Oct 01 13:26:32 crc kubenswrapper[4749]: 
I1001 13:26:32.072054 4749 scope.go:117] "RemoveContainer" containerID="3724b72c95f73116a157ff981fdb65ad6cdbea28bbb11e94e6f19e9c20c69001" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.116050 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.132768 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.144991 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:32 crc kubenswrapper[4749]: E1001 13:26:32.145435 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-api" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145455 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-api" Oct 01 13:26:32 crc kubenswrapper[4749]: E1001 13:26:32.145470 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145478 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: E1001 13:26:32.145504 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cf66c1c-176c-4671-9887-07295eb47200" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145512 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cf66c1c-176c-4671-9887-07295eb47200" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: E1001 13:26:32.145523 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32fe2369-6499-4f14-8c75-33933c8bb608" containerName="glance-httpd" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145530 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fe2369-6499-4f14-8c75-33933c8bb608" containerName="glance-httpd" Oct 01 13:26:32 crc kubenswrapper[4749]: E1001 13:26:32.145544 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1f0790-fd61-4075-b031-cd82fa151ab8" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145553 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1f0790-fd61-4075-b031-cd82fa151ab8" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: E1001 13:26:32.145565 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32fe2369-6499-4f14-8c75-33933c8bb608" containerName="glance-log" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145572 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fe2369-6499-4f14-8c75-33933c8bb608" containerName="glance-log" Oct 01 13:26:32 crc kubenswrapper[4749]: E1001 13:26:32.145601 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-httpd" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145608 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-httpd" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145880 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145905 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cf66c1c-176c-4671-9887-07295eb47200" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145917 4749 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-httpd" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145947 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1f0790-fd61-4075-b031-cd82fa151ab8" containerName="mariadb-database-create" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145956 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="60010c1d-09a2-4f43-9f80-c896d4d02945" containerName="neutron-api" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.145966 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fe2369-6499-4f14-8c75-33933c8bb608" containerName="glance-httpd" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.146441 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fe2369-6499-4f14-8c75-33933c8bb608" containerName="glance-log" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.147425 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.149346 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.149562 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.165541 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.279784 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc78bfd1-472f-4d64-b48c-7b986bee129a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.279838 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.279891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.280033 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-scripts\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.280126 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.280152 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc78bfd1-472f-4d64-b48c-7b986bee129a-logs\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.280321 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-config-data\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.280496 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7dct\" (UniqueName: \"kubernetes.io/projected/bc78bfd1-472f-4d64-b48c-7b986bee129a-kube-api-access-l7dct\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.382882 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bc78bfd1-472f-4d64-b48c-7b986bee129a-logs\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.382953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-config-data\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383036 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7dct\" (UniqueName: \"kubernetes.io/projected/bc78bfd1-472f-4d64-b48c-7b986bee129a-kube-api-access-l7dct\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc78bfd1-472f-4d64-b48c-7b986bee129a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383150 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383172 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383254 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-scripts\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383320 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc78bfd1-472f-4d64-b48c-7b986bee129a-logs\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383629 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc78bfd1-472f-4d64-b48c-7b986bee129a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.383705 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: 
\"bc78bfd1-472f-4d64-b48c-7b986bee129a\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.387744 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.387919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.389522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-config-data\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.391205 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc78bfd1-472f-4d64-b48c-7b986bee129a-scripts\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.403075 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7dct\" (UniqueName: \"kubernetes.io/projected/bc78bfd1-472f-4d64-b48c-7b986bee129a-kube-api-access-l7dct\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " 
pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.422273 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"bc78bfd1-472f-4d64-b48c-7b986bee129a\") " pod="openstack/glance-default-external-api-0" Oct 01 13:26:32 crc kubenswrapper[4749]: I1001 13:26:32.465803 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.082851 4749 scope.go:117] "RemoveContainer" containerID="298d7af4b919df3b1a71513ec6a0a642bd07a773a298a78749c0b2597d1bce01" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.272052 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32fe2369-6499-4f14-8c75-33933c8bb608" path="/var/lib/kubelet/pods/32fe2369-6499-4f14-8c75-33933c8bb608/volumes" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.730437 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.830336 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.924716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-config-data\") pod \"81ccc37c-c732-4d54-b526-b70f51a5af62\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.924760 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc2vx\" (UniqueName: \"kubernetes.io/projected/81ccc37c-c732-4d54-b526-b70f51a5af62-kube-api-access-rc2vx\") pod \"81ccc37c-c732-4d54-b526-b70f51a5af62\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.924853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-scripts\") pod \"81ccc37c-c732-4d54-b526-b70f51a5af62\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.924876 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-combined-ca-bundle\") pod \"81ccc37c-c732-4d54-b526-b70f51a5af62\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.924917 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-internal-tls-certs\") pod \"81ccc37c-c732-4d54-b526-b70f51a5af62\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.924959 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-httpd-run\") pod \"81ccc37c-c732-4d54-b526-b70f51a5af62\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.924993 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-logs\") pod \"81ccc37c-c732-4d54-b526-b70f51a5af62\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.925038 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"81ccc37c-c732-4d54-b526-b70f51a5af62\" (UID: \"81ccc37c-c732-4d54-b526-b70f51a5af62\") " Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.925852 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-logs" (OuterVolumeSpecName: "logs") pod "81ccc37c-c732-4d54-b526-b70f51a5af62" (UID: "81ccc37c-c732-4d54-b526-b70f51a5af62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.925895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "81ccc37c-c732-4d54-b526-b70f51a5af62" (UID: "81ccc37c-c732-4d54-b526-b70f51a5af62"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.932168 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "81ccc37c-c732-4d54-b526-b70f51a5af62" (UID: "81ccc37c-c732-4d54-b526-b70f51a5af62"). 
InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.932695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-scripts" (OuterVolumeSpecName: "scripts") pod "81ccc37c-c732-4d54-b526-b70f51a5af62" (UID: "81ccc37c-c732-4d54-b526-b70f51a5af62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.952362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ccc37c-c732-4d54-b526-b70f51a5af62-kube-api-access-rc2vx" (OuterVolumeSpecName: "kube-api-access-rc2vx") pod "81ccc37c-c732-4d54-b526-b70f51a5af62" (UID: "81ccc37c-c732-4d54-b526-b70f51a5af62"). InnerVolumeSpecName "kube-api-access-rc2vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:33 crc kubenswrapper[4749]: I1001 13:26:33.986615 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81ccc37c-c732-4d54-b526-b70f51a5af62" (UID: "81ccc37c-c732-4d54-b526-b70f51a5af62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.007369 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "81ccc37c-c732-4d54-b526-b70f51a5af62" (UID: "81ccc37c-c732-4d54-b526-b70f51a5af62"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.021072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.022287 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-config-data" (OuterVolumeSpecName: "config-data") pod "81ccc37c-c732-4d54-b526-b70f51a5af62" (UID: "81ccc37c-c732-4d54-b526-b70f51a5af62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.022571 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-85b88cb8b7-2gzx9" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.028030 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.028062 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc2vx\" (UniqueName: \"kubernetes.io/projected/81ccc37c-c732-4d54-b526-b70f51a5af62-kube-api-access-rc2vx\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.028071 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.028081 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.028090 4749 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/81ccc37c-c732-4d54-b526-b70f51a5af62-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.028098 4749 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.028106 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/81ccc37c-c732-4d54-b526-b70f51a5af62-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.028135 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.048130 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.117260 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc78bfd1-472f-4d64-b48c-7b986bee129a","Type":"ContainerStarted","Data":"d33d7ece9b484460a2b6bfcd2e67861bc38876238730d8c1aa455f29d00a2af6"} Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.123063 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"3f0334e5-add1-4ced-bad4-7e77d528e28a","Type":"ContainerStarted","Data":"43a3aaa406406f2c01350a4f99a9e9972521bd4f75669f5d27d92ab5cc88364c"} Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.129568 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.136997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"81ccc37c-c732-4d54-b526-b70f51a5af62","Type":"ContainerDied","Data":"ff21d934b09191f5abcb85e73e74cf07538e1519004910a48c049070efd1a6b0"} Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.137072 4749 scope.go:117] "RemoveContainer" containerID="fd1307259f58642a9b1989cf41778ef38302632636318edc11bb895abe20bf6b" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.137293 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.184340 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.192622 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.229951 4749 scope.go:117] "RemoveContainer" containerID="8c6f2c8133bb9fa384ce08f9a8d3dc16a3bd5486d4520089b8ff355faf77d338" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.242335 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:26:34 crc kubenswrapper[4749]: E1001 13:26:34.242843 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerName="glance-log" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.242862 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerName="glance-log" Oct 01 13:26:34 crc kubenswrapper[4749]: E1001 13:26:34.242897 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" 
containerName="glance-httpd" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.242906 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerName="glance-httpd" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.243093 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerName="glance-log" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.243118 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" containerName="glance-httpd" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.249491 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.250740 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.258690 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.258971 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.333131 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.333204 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.333244 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.333354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss2pk\" (UniqueName: \"kubernetes.io/projected/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-kube-api-access-ss2pk\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.333381 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-logs\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.333409 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.333425 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.333448 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435586 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435701 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss2pk\" (UniqueName: \"kubernetes.io/projected/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-kube-api-access-ss2pk\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435717 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435732 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-logs\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435766 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435783 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435827 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.435846 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-httpd-run\") 
pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.438529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-logs\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.439871 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.441438 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.441788 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.443771 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " 
pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.445112 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.453773 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss2pk\" (UniqueName: \"kubernetes.io/projected/495b97d9-1d27-4e6e-a857-ee6cfdf6dffa-kube-api-access-ss2pk\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.480530 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa\") " pod="openstack/glance-default-internal-api-0" Oct 01 13:26:34 crc kubenswrapper[4749]: I1001 13:26:34.593159 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.154553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc78bfd1-472f-4d64-b48c-7b986bee129a","Type":"ContainerStarted","Data":"95ff7b9e89f125871c0728723e34023049c2c61fc2e71565cefd37a6aeb45fb7"} Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.247364 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ccc37c-c732-4d54-b526-b70f51a5af62" path="/var/lib/kubelet/pods/81ccc37c-c732-4d54-b526-b70f51a5af62/volumes" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.278500 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.628107 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5927-account-create-rtd62"] Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.629319 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5927-account-create-rtd62" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.631809 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.650158 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5927-account-create-rtd62"] Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.767821 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmhzk\" (UniqueName: \"kubernetes.io/projected/66ae0f3d-5137-4015-a7c3-f082f485f058-kube-api-access-mmhzk\") pod \"nova-api-5927-account-create-rtd62\" (UID: \"66ae0f3d-5137-4015-a7c3-f082f485f058\") " pod="openstack/nova-api-5927-account-create-rtd62" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.806802 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d9b5-account-create-6bqbd"] Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.808481 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d9b5-account-create-6bqbd" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.811828 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.818549 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d9b5-account-create-6bqbd"] Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.871245 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmhzk\" (UniqueName: \"kubernetes.io/projected/66ae0f3d-5137-4015-a7c3-f082f485f058-kube-api-access-mmhzk\") pod \"nova-api-5927-account-create-rtd62\" (UID: \"66ae0f3d-5137-4015-a7c3-f082f485f058\") " pod="openstack/nova-api-5927-account-create-rtd62" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.911326 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmhzk\" (UniqueName: \"kubernetes.io/projected/66ae0f3d-5137-4015-a7c3-f082f485f058-kube-api-access-mmhzk\") pod \"nova-api-5927-account-create-rtd62\" (UID: \"66ae0f3d-5137-4015-a7c3-f082f485f058\") " pod="openstack/nova-api-5927-account-create-rtd62" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.950485 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5927-account-create-rtd62" Oct 01 13:26:35 crc kubenswrapper[4749]: I1001 13:26:35.972667 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f88wf\" (UniqueName: \"kubernetes.io/projected/bfb1ec85-617c-4675-b366-68beb4f61f3a-kube-api-access-f88wf\") pod \"nova-cell0-d9b5-account-create-6bqbd\" (UID: \"bfb1ec85-617c-4675-b366-68beb4f61f3a\") " pod="openstack/nova-cell0-d9b5-account-create-6bqbd" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.029698 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4752-account-create-zkt6l"] Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.034935 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4752-account-create-zkt6l" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.037436 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.039603 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4752-account-create-zkt6l"] Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.075005 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f88wf\" (UniqueName: \"kubernetes.io/projected/bfb1ec85-617c-4675-b366-68beb4f61f3a-kube-api-access-f88wf\") pod \"nova-cell0-d9b5-account-create-6bqbd\" (UID: \"bfb1ec85-617c-4675-b366-68beb4f61f3a\") " pod="openstack/nova-cell0-d9b5-account-create-6bqbd" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.094469 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f88wf\" (UniqueName: \"kubernetes.io/projected/bfb1ec85-617c-4675-b366-68beb4f61f3a-kube-api-access-f88wf\") pod \"nova-cell0-d9b5-account-create-6bqbd\" (UID: \"bfb1ec85-617c-4675-b366-68beb4f61f3a\") " 
pod="openstack/nova-cell0-d9b5-account-create-6bqbd" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.129140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d9b5-account-create-6bqbd" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.180614 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgwcg\" (UniqueName: \"kubernetes.io/projected/688d40fd-4ccf-4228-9a52-e2bcdd3cf761-kube-api-access-vgwcg\") pod \"nova-cell1-4752-account-create-zkt6l\" (UID: \"688d40fd-4ccf-4228-9a52-e2bcdd3cf761\") " pod="openstack/nova-cell1-4752-account-create-zkt6l" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.182894 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa","Type":"ContainerStarted","Data":"34ec5945d562f38cd09f1b004dee3c29d6cb97f675772d63cb90d719715c992b"} Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.182940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa","Type":"ContainerStarted","Data":"59b24e08e6a6a6cd38e7adc92b20fbd3b42c02a5af33dfc843522179ccf5b200"} Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.185084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bc78bfd1-472f-4d64-b48c-7b986bee129a","Type":"ContainerStarted","Data":"14509a5bb93ec274b797c5c4ac8d8ce57dbe7d52ef82069f57396fe2faaaf7c7"} Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.215835 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.215814417 podStartE2EDuration="4.215814417s" podCreationTimestamp="2025-10-01 13:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:36.207401638 +0000 UTC m=+1256.261386557" watchObservedRunningTime="2025-10-01 13:26:36.215814417 +0000 UTC m=+1256.269799326" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.282199 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgwcg\" (UniqueName: \"kubernetes.io/projected/688d40fd-4ccf-4228-9a52-e2bcdd3cf761-kube-api-access-vgwcg\") pod \"nova-cell1-4752-account-create-zkt6l\" (UID: \"688d40fd-4ccf-4228-9a52-e2bcdd3cf761\") " pod="openstack/nova-cell1-4752-account-create-zkt6l" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.307806 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgwcg\" (UniqueName: \"kubernetes.io/projected/688d40fd-4ccf-4228-9a52-e2bcdd3cf761-kube-api-access-vgwcg\") pod \"nova-cell1-4752-account-create-zkt6l\" (UID: \"688d40fd-4ccf-4228-9a52-e2bcdd3cf761\") " pod="openstack/nova-cell1-4752-account-create-zkt6l" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.363841 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4752-account-create-zkt6l" Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.562391 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5927-account-create-rtd62"] Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.763754 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d9b5-account-create-6bqbd"] Oct 01 13:26:36 crc kubenswrapper[4749]: W1001 13:26:36.765656 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfb1ec85_617c_4675_b366_68beb4f61f3a.slice/crio-3c474c5c84015ce7955796b2263b1e5055950a2158b488b9b592f8f2e5e6906d WatchSource:0}: Error finding container 3c474c5c84015ce7955796b2263b1e5055950a2158b488b9b592f8f2e5e6906d: Status 404 returned error can't find the container with id 3c474c5c84015ce7955796b2263b1e5055950a2158b488b9b592f8f2e5e6906d Oct 01 13:26:36 crc kubenswrapper[4749]: W1001 13:26:36.924290 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod688d40fd_4ccf_4228_9a52_e2bcdd3cf761.slice/crio-21e9c670b323d10e30bc44e930e673b901a4aa2ba527dabbc1cf66504de9e474 WatchSource:0}: Error finding container 21e9c670b323d10e30bc44e930e673b901a4aa2ba527dabbc1cf66504de9e474: Status 404 returned error can't find the container with id 21e9c670b323d10e30bc44e930e673b901a4aa2ba527dabbc1cf66504de9e474 Oct 01 13:26:36 crc kubenswrapper[4749]: I1001 13:26:36.932547 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4752-account-create-zkt6l"] Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.195477 4749 generic.go:334] "Generic (PLEG): container finished" podID="bfb1ec85-617c-4675-b366-68beb4f61f3a" containerID="23888105b7f6bee898bc2ccf61d14a10575290c1cc1ae4a8efda236744d121de" exitCode=0 Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.195571 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b5-account-create-6bqbd" event={"ID":"bfb1ec85-617c-4675-b366-68beb4f61f3a","Type":"ContainerDied","Data":"23888105b7f6bee898bc2ccf61d14a10575290c1cc1ae4a8efda236744d121de"} Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.195606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b5-account-create-6bqbd" event={"ID":"bfb1ec85-617c-4675-b366-68beb4f61f3a","Type":"ContainerStarted","Data":"3c474c5c84015ce7955796b2263b1e5055950a2158b488b9b592f8f2e5e6906d"} Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.202624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4752-account-create-zkt6l" event={"ID":"688d40fd-4ccf-4228-9a52-e2bcdd3cf761","Type":"ContainerStarted","Data":"475a3b76cce60952b57dc53079f6ec6a54634920be542a8e493dd4b52df36f0f"} Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.202879 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4752-account-create-zkt6l" event={"ID":"688d40fd-4ccf-4228-9a52-e2bcdd3cf761","Type":"ContainerStarted","Data":"21e9c670b323d10e30bc44e930e673b901a4aa2ba527dabbc1cf66504de9e474"} Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.204466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"495b97d9-1d27-4e6e-a857-ee6cfdf6dffa","Type":"ContainerStarted","Data":"7f07b9cdb1450fa5325c5fa361ebb741923688860ff9a70e5bf6489186cc5dd5"} Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.208049 4749 generic.go:334] "Generic (PLEG): container finished" podID="66ae0f3d-5137-4015-a7c3-f082f485f058" containerID="8c773bad80c2b088d7018ffd8c47d352e36fe02527abfd1df6a15c019082d2ea" exitCode=0 Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.208647 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5927-account-create-rtd62" 
event={"ID":"66ae0f3d-5137-4015-a7c3-f082f485f058","Type":"ContainerDied","Data":"8c773bad80c2b088d7018ffd8c47d352e36fe02527abfd1df6a15c019082d2ea"} Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.208674 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5927-account-create-rtd62" event={"ID":"66ae0f3d-5137-4015-a7c3-f082f485f058","Type":"ContainerStarted","Data":"91e42681bd66a5cf7bd8596a84d5fcb4d7c4d53d6ef39d2e8bda6a688cb3a0a4"} Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.227479 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-4752-account-create-zkt6l" podStartSLOduration=2.227461818 podStartE2EDuration="2.227461818s" podCreationTimestamp="2025-10-01 13:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:37.223978264 +0000 UTC m=+1257.277963163" watchObservedRunningTime="2025-10-01 13:26:37.227461818 +0000 UTC m=+1257.281446717" Oct 01 13:26:37 crc kubenswrapper[4749]: I1001 13:26:37.259035 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.259018072 podStartE2EDuration="3.259018072s" podCreationTimestamp="2025-10-01 13:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:37.251003725 +0000 UTC m=+1257.304988624" watchObservedRunningTime="2025-10-01 13:26:37.259018072 +0000 UTC m=+1257.313002971" Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.226933 4749 generic.go:334] "Generic (PLEG): container finished" podID="d9ead71f-c58e-4634-96f9-81c9b165e24c" containerID="ec82d5bfc32297ac60d77fde66e8e92e7ae2c1287ead785bfa8106988027fb00" exitCode=0 Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.227064 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-tc29m" event={"ID":"d9ead71f-c58e-4634-96f9-81c9b165e24c","Type":"ContainerDied","Data":"ec82d5bfc32297ac60d77fde66e8e92e7ae2c1287ead785bfa8106988027fb00"} Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.234434 4749 generic.go:334] "Generic (PLEG): container finished" podID="688d40fd-4ccf-4228-9a52-e2bcdd3cf761" containerID="475a3b76cce60952b57dc53079f6ec6a54634920be542a8e493dd4b52df36f0f" exitCode=0 Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.234561 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4752-account-create-zkt6l" event={"ID":"688d40fd-4ccf-4228-9a52-e2bcdd3cf761","Type":"ContainerDied","Data":"475a3b76cce60952b57dc53079f6ec6a54634920be542a8e493dd4b52df36f0f"} Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.786055 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5927-account-create-rtd62" Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.792434 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d9b5-account-create-6bqbd" Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.860787 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmhzk\" (UniqueName: \"kubernetes.io/projected/66ae0f3d-5137-4015-a7c3-f082f485f058-kube-api-access-mmhzk\") pod \"66ae0f3d-5137-4015-a7c3-f082f485f058\" (UID: \"66ae0f3d-5137-4015-a7c3-f082f485f058\") " Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.860842 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f88wf\" (UniqueName: \"kubernetes.io/projected/bfb1ec85-617c-4675-b366-68beb4f61f3a-kube-api-access-f88wf\") pod \"bfb1ec85-617c-4675-b366-68beb4f61f3a\" (UID: \"bfb1ec85-617c-4675-b366-68beb4f61f3a\") " Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.867204 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ae0f3d-5137-4015-a7c3-f082f485f058-kube-api-access-mmhzk" (OuterVolumeSpecName: "kube-api-access-mmhzk") pod "66ae0f3d-5137-4015-a7c3-f082f485f058" (UID: "66ae0f3d-5137-4015-a7c3-f082f485f058"). InnerVolumeSpecName "kube-api-access-mmhzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.867517 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb1ec85-617c-4675-b366-68beb4f61f3a-kube-api-access-f88wf" (OuterVolumeSpecName: "kube-api-access-f88wf") pod "bfb1ec85-617c-4675-b366-68beb4f61f3a" (UID: "bfb1ec85-617c-4675-b366-68beb4f61f3a"). InnerVolumeSpecName "kube-api-access-f88wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.962836 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmhzk\" (UniqueName: \"kubernetes.io/projected/66ae0f3d-5137-4015-a7c3-f082f485f058-kube-api-access-mmhzk\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:38 crc kubenswrapper[4749]: I1001 13:26:38.962876 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f88wf\" (UniqueName: \"kubernetes.io/projected/bfb1ec85-617c-4675-b366-68beb4f61f3a-kube-api-access-f88wf\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.244288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5927-account-create-rtd62" event={"ID":"66ae0f3d-5137-4015-a7c3-f082f485f058","Type":"ContainerDied","Data":"91e42681bd66a5cf7bd8596a84d5fcb4d7c4d53d6ef39d2e8bda6a688cb3a0a4"} Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.244330 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5927-account-create-rtd62" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.244344 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91e42681bd66a5cf7bd8596a84d5fcb4d7c4d53d6ef39d2e8bda6a688cb3a0a4" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.246559 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d9b5-account-create-6bqbd" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.246801 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b5-account-create-6bqbd" event={"ID":"bfb1ec85-617c-4675-b366-68beb4f61f3a","Type":"ContainerDied","Data":"3c474c5c84015ce7955796b2263b1e5055950a2158b488b9b592f8f2e5e6906d"} Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.246832 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c474c5c84015ce7955796b2263b1e5055950a2158b488b9b592f8f2e5e6906d" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.617158 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tc29m" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.676683 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbx2b\" (UniqueName: \"kubernetes.io/projected/d9ead71f-c58e-4634-96f9-81c9b165e24c-kube-api-access-mbx2b\") pod \"d9ead71f-c58e-4634-96f9-81c9b165e24c\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.676752 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-combined-ca-bundle\") pod \"d9ead71f-c58e-4634-96f9-81c9b165e24c\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.676901 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9ead71f-c58e-4634-96f9-81c9b165e24c-etc-machine-id\") pod \"d9ead71f-c58e-4634-96f9-81c9b165e24c\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.676958 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-db-sync-config-data\") pod \"d9ead71f-c58e-4634-96f9-81c9b165e24c\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.676988 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-scripts\") pod \"d9ead71f-c58e-4634-96f9-81c9b165e24c\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.677019 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-config-data\") pod \"d9ead71f-c58e-4634-96f9-81c9b165e24c\" (UID: \"d9ead71f-c58e-4634-96f9-81c9b165e24c\") " Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.677426 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9ead71f-c58e-4634-96f9-81c9b165e24c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d9ead71f-c58e-4634-96f9-81c9b165e24c" (UID: "d9ead71f-c58e-4634-96f9-81c9b165e24c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.682103 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ead71f-c58e-4634-96f9-81c9b165e24c-kube-api-access-mbx2b" (OuterVolumeSpecName: "kube-api-access-mbx2b") pod "d9ead71f-c58e-4634-96f9-81c9b165e24c" (UID: "d9ead71f-c58e-4634-96f9-81c9b165e24c"). InnerVolumeSpecName "kube-api-access-mbx2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.682200 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d9ead71f-c58e-4634-96f9-81c9b165e24c" (UID: "d9ead71f-c58e-4634-96f9-81c9b165e24c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.682683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-scripts" (OuterVolumeSpecName: "scripts") pod "d9ead71f-c58e-4634-96f9-81c9b165e24c" (UID: "d9ead71f-c58e-4634-96f9-81c9b165e24c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.716330 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9ead71f-c58e-4634-96f9-81c9b165e24c" (UID: "d9ead71f-c58e-4634-96f9-81c9b165e24c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.726478 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-config-data" (OuterVolumeSpecName: "config-data") pod "d9ead71f-c58e-4634-96f9-81c9b165e24c" (UID: "d9ead71f-c58e-4634-96f9-81c9b165e24c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.778528 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9ead71f-c58e-4634-96f9-81c9b165e24c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.778563 4749 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.778572 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.778581 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.778589 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbx2b\" (UniqueName: \"kubernetes.io/projected/d9ead71f-c58e-4634-96f9-81c9b165e24c-kube-api-access-mbx2b\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.778598 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ead71f-c58e-4634-96f9-81c9b165e24c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.812051 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4752-account-create-zkt6l" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.880381 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgwcg\" (UniqueName: \"kubernetes.io/projected/688d40fd-4ccf-4228-9a52-e2bcdd3cf761-kube-api-access-vgwcg\") pod \"688d40fd-4ccf-4228-9a52-e2bcdd3cf761\" (UID: \"688d40fd-4ccf-4228-9a52-e2bcdd3cf761\") " Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.893457 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688d40fd-4ccf-4228-9a52-e2bcdd3cf761-kube-api-access-vgwcg" (OuterVolumeSpecName: "kube-api-access-vgwcg") pod "688d40fd-4ccf-4228-9a52-e2bcdd3cf761" (UID: "688d40fd-4ccf-4228-9a52-e2bcdd3cf761"). InnerVolumeSpecName "kube-api-access-vgwcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.972641 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:39 crc kubenswrapper[4749]: I1001 13:26:39.982451 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgwcg\" (UniqueName: \"kubernetes.io/projected/688d40fd-4ccf-4228-9a52-e2bcdd3cf761-kube-api-access-vgwcg\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:39.999847 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.261166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tc29m" event={"ID":"d9ead71f-c58e-4634-96f9-81c9b165e24c","Type":"ContainerDied","Data":"0e0920003169871c235816c60069d2bc2cf2e0d63df5edcf1c6c2b5c1184685a"} Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.261204 4749 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0e0920003169871c235816c60069d2bc2cf2e0d63df5edcf1c6c2b5c1184685a" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.261291 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tc29m" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.264991 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-4752-account-create-zkt6l" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.264976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4752-account-create-zkt6l" event={"ID":"688d40fd-4ccf-4228-9a52-e2bcdd3cf761","Type":"ContainerDied","Data":"21e9c670b323d10e30bc44e930e673b901a4aa2ba527dabbc1cf66504de9e474"} Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.265107 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21e9c670b323d10e30bc44e930e673b901a4aa2ba527dabbc1cf66504de9e474" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.265317 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.302925 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545108 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:26:40 crc kubenswrapper[4749]: E1001 13:26:40.545526 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb1ec85-617c-4675-b366-68beb4f61f3a" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545541 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb1ec85-617c-4675-b366-68beb4f61f3a" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: E1001 13:26:40.545565 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="66ae0f3d-5137-4015-a7c3-f082f485f058" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545572 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ae0f3d-5137-4015-a7c3-f082f485f058" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: E1001 13:26:40.545589 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ead71f-c58e-4634-96f9-81c9b165e24c" containerName="cinder-db-sync" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545595 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ead71f-c58e-4634-96f9-81c9b165e24c" containerName="cinder-db-sync" Oct 01 13:26:40 crc kubenswrapper[4749]: E1001 13:26:40.545604 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688d40fd-4ccf-4228-9a52-e2bcdd3cf761" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545610 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="688d40fd-4ccf-4228-9a52-e2bcdd3cf761" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545816 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="688d40fd-4ccf-4228-9a52-e2bcdd3cf761" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545842 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb1ec85-617c-4675-b366-68beb4f61f3a" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545851 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ae0f3d-5137-4015-a7c3-f082f485f058" containerName="mariadb-account-create" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.545862 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ead71f-c58e-4634-96f9-81c9b165e24c" containerName="cinder-db-sync" Oct 01 13:26:40 crc 
kubenswrapper[4749]: I1001 13:26:40.547027 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.557435 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.557653 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-w28m5" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.557981 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.558201 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.571293 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.684099 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cccb8cfc9-7lmlc"] Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.687012 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.699270 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cccb8cfc9-7lmlc"] Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.721377 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.721645 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5psr\" (UniqueName: \"kubernetes.io/projected/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-kube-api-access-t5psr\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.721747 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.721806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.721908 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.721994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.771246 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.772812 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.776523 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.790804 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823686 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " 
pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823765 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-svc\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823782 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823815 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823830 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-config\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823849 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc 
kubenswrapper[4749]: I1001 13:26:40.823872 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823896 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp2p8\" (UniqueName: \"kubernetes.io/projected/0585bc19-be35-4666-8882-9f8332fd362d-kube-api-access-qp2p8\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.823957 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.824017 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5psr\" (UniqueName: \"kubernetes.io/projected/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-kube-api-access-t5psr\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.824232 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.829133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.829529 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.838070 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.840186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5psr\" (UniqueName: \"kubernetes.io/projected/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-kube-api-access-t5psr\") pod \"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.850801 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.877438 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.925497 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b463a6a1-a887-4e8c-a62d-88b39b076c64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.925844 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b463a6a1-a887-4e8c-a62d-88b39b076c64-logs\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.925878 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data-custom\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.925903 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.925924 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-svc\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.925956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.925973 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-config\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.925995 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.926013 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-scripts\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.926029 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxv6l\" (UniqueName: 
\"kubernetes.io/projected/b463a6a1-a887-4e8c-a62d-88b39b076c64-kube-api-access-gxv6l\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.926047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp2p8\" (UniqueName: \"kubernetes.io/projected/0585bc19-be35-4666-8882-9f8332fd362d-kube-api-access-qp2p8\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.926076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.926090 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.927061 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-sb\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.927135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-config\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: 
\"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.927375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-svc\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.927783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-swift-storage-0\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.928083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-nb\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:40 crc kubenswrapper[4749]: I1001 13:26:40.943764 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp2p8\" (UniqueName: \"kubernetes.io/projected/0585bc19-be35-4666-8882-9f8332fd362d-kube-api-access-qp2p8\") pod \"dnsmasq-dns-6cccb8cfc9-7lmlc\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.005682 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.027836 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b463a6a1-a887-4e8c-a62d-88b39b076c64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.027914 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b463a6a1-a887-4e8c-a62d-88b39b076c64-logs\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.027955 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data-custom\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.027950 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b463a6a1-a887-4e8c-a62d-88b39b076c64-etc-machine-id\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.028043 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-scripts\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.028061 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxv6l\" (UniqueName: 
\"kubernetes.io/projected/b463a6a1-a887-4e8c-a62d-88b39b076c64-kube-api-access-gxv6l\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.028088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.028101 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.028323 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b463a6a1-a887-4e8c-a62d-88b39b076c64-logs\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.032742 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-scripts\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.033782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data-custom\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.034135 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.037293 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.047591 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxv6l\" (UniqueName: \"kubernetes.io/projected/b463a6a1-a887-4e8c-a62d-88b39b076c64-kube-api-access-gxv6l\") pod \"cinder-api-0\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.088741 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.130951 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ts66r"] Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.132372 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.134183 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fdsw2" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.136018 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.136160 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.148291 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ts66r"] Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.232157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-scripts\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.232463 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.232491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqdk7\" (UniqueName: \"kubernetes.io/projected/a63bfe18-de5a-44e8-abef-17ee0f2af92a-kube-api-access-wqdk7\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " 
pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.232530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-config-data\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.334842 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-scripts\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.334905 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.334926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqdk7\" (UniqueName: \"kubernetes.io/projected/a63bfe18-de5a-44e8-abef-17ee0f2af92a-kube-api-access-wqdk7\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.334961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-config-data\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: 
\"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.339522 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-config-data\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.341648 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-scripts\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.343926 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.356311 4749 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod24e52665-f55b-4137-82ec-7ab7392bca61"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod24e52665-f55b-4137-82ec-7ab7392bca61] : Timed out while waiting for systemd to remove kubepods-besteffort-pod24e52665_f55b_4137_82ec_7ab7392bca61.slice" Oct 01 13:26:41 crc kubenswrapper[4749]: E1001 13:26:41.356369 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod24e52665-f55b-4137-82ec-7ab7392bca61] : unable to destroy cgroup paths for cgroup [kubepods besteffort 
pod24e52665-f55b-4137-82ec-7ab7392bca61] : Timed out while waiting for systemd to remove kubepods-besteffort-pod24e52665_f55b_4137_82ec_7ab7392bca61.slice" pod="openstack/ceilometer-0" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.386839 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqdk7\" (UniqueName: \"kubernetes.io/projected/a63bfe18-de5a-44e8-abef-17ee0f2af92a-kube-api-access-wqdk7\") pod \"nova-cell0-conductor-db-sync-ts66r\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.419970 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:26:41 crc kubenswrapper[4749]: W1001 13:26:41.439349 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a25e1c1_3574_4c2c_b471_ed9e4614fd06.slice/crio-da09e43173409edcaf46ad2c8ce2d481023ba73746f023725072ee772e2e6d79 WatchSource:0}: Error finding container da09e43173409edcaf46ad2c8ce2d481023ba73746f023725072ee772e2e6d79: Status 404 returned error can't find the container with id da09e43173409edcaf46ad2c8ce2d481023ba73746f023725072ee772e2e6d79 Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.466251 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.471686 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.597113 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.631679 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cccb8cfc9-7lmlc"] Oct 01 13:26:41 crc kubenswrapper[4749]: I1001 13:26:41.989301 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ts66r"] Oct 01 13:26:42 crc kubenswrapper[4749]: W1001 13:26:42.000828 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda63bfe18_de5a_44e8_abef_17ee0f2af92a.slice/crio-a405f8074ddbe52f56cc7388b9917f032dc3c40fc4849e790f5f1e39da4fa748 WatchSource:0}: Error finding container a405f8074ddbe52f56cc7388b9917f032dc3c40fc4849e790f5f1e39da4fa748: Status 404 returned error can't find the container with id a405f8074ddbe52f56cc7388b9917f032dc3c40fc4849e790f5f1e39da4fa748 Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.344622 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a25e1c1-3574-4c2c-b471-ed9e4614fd06","Type":"ContainerStarted","Data":"da09e43173409edcaf46ad2c8ce2d481023ba73746f023725072ee772e2e6d79"} Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.359638 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ts66r" event={"ID":"a63bfe18-de5a-44e8-abef-17ee0f2af92a","Type":"ContainerStarted","Data":"a405f8074ddbe52f56cc7388b9917f032dc3c40fc4849e790f5f1e39da4fa748"} Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.368006 4749 generic.go:334] "Generic (PLEG): container 
finished" podID="0585bc19-be35-4666-8882-9f8332fd362d" containerID="1ad06e42f284f5cd11736595c78ac06024aad7c092b91f7f2c6c39b9ecba5933" exitCode=0 Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.368067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" event={"ID":"0585bc19-be35-4666-8882-9f8332fd362d","Type":"ContainerDied","Data":"1ad06e42f284f5cd11736595c78ac06024aad7c092b91f7f2c6c39b9ecba5933"} Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.368091 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" event={"ID":"0585bc19-be35-4666-8882-9f8332fd362d","Type":"ContainerStarted","Data":"8f928912266c1f8c468dae2c167462a53981b874c51d27eb6a090efe73a16fd1"} Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.372292 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.372369 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b463a6a1-a887-4e8c-a62d-88b39b076c64","Type":"ContainerStarted","Data":"e48c20dabad7a319ae0465ff057538ae568a3d33cb3f41519774839db48cc151"} Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.459650 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.470063 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.470337 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.470357 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.477789 4749 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.481960 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.485455 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.486316 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.493974 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.520556 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.554094 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.573345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-config-data\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.575431 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hpqq\" (UniqueName: \"kubernetes.io/projected/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-kube-api-access-9hpqq\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.575481 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.575549 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.575582 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.576086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-scripts\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.576134 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.677599 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-scripts\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " 
pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.677643 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.677723 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-config-data\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.677784 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hpqq\" (UniqueName: \"kubernetes.io/projected/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-kube-api-access-9hpqq\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.677806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.677850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.677874 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.678241 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-run-httpd\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.678428 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-log-httpd\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.684953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-config-data\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.685375 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.685738 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-scripts\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.696754 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.708882 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hpqq\" (UniqueName: \"kubernetes.io/projected/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-kube-api-access-9hpqq\") pod \"ceilometer-0\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " pod="openstack/ceilometer-0" Oct 01 13:26:42 crc kubenswrapper[4749]: I1001 13:26:42.845042 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.253958 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24e52665-f55b-4137-82ec-7ab7392bca61" path="/var/lib/kubelet/pods/24e52665-f55b-4137-82ec-7ab7392bca61/volumes" Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.318868 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.379707 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.389343 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a25e1c1-3574-4c2c-b471-ed9e4614fd06","Type":"ContainerStarted","Data":"e868b2b2be00db01d900882983309c3cf3a3c2615b18db834d761bf3b0b242a2"} Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.400056 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" event={"ID":"0585bc19-be35-4666-8882-9f8332fd362d","Type":"ContainerStarted","Data":"362dfa8dcb380fc0c25a0fc3ec487416da72f8450c39a2e3b46abbc8f25f9aac"} Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.400611 
4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:43 crc kubenswrapper[4749]: W1001 13:26:43.409348 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f73fdbc_2d5f_4770_856b_a2a180bfd6c5.slice/crio-e313525b9305daf3015b829dff530c9016b62be57528e6d937698595a19208fe WatchSource:0}: Error finding container e313525b9305daf3015b829dff530c9016b62be57528e6d937698595a19208fe: Status 404 returned error can't find the container with id e313525b9305daf3015b829dff530c9016b62be57528e6d937698595a19208fe Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.413404 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b463a6a1-a887-4e8c-a62d-88b39b076c64","Type":"ContainerStarted","Data":"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56"} Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.413459 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.413474 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 13:26:43 crc kubenswrapper[4749]: I1001 13:26:43.421119 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" podStartSLOduration=3.421098959 podStartE2EDuration="3.421098959s" podCreationTimestamp="2025-10-01 13:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:43.419602285 +0000 UTC m=+1263.473587184" watchObservedRunningTime="2025-10-01 13:26:43.421098959 +0000 UTC m=+1263.475083858" Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.422831 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerStarted","Data":"d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2"} Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.423290 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerStarted","Data":"e313525b9305daf3015b829dff530c9016b62be57528e6d937698595a19208fe"} Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.424997 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a25e1c1-3574-4c2c-b471-ed9e4614fd06","Type":"ContainerStarted","Data":"602bde3846acfcbf2ea9e335da579153206a8607a608b0b10213243bd5d3f32b"} Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.427706 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b463a6a1-a887-4e8c-a62d-88b39b076c64","Type":"ContainerStarted","Data":"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397"} Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.428080 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerName="cinder-api-log" containerID="cri-o://6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56" gracePeriod=30 Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.428168 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerName="cinder-api" containerID="cri-o://79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397" gracePeriod=30 Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.452447 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.142273723 
podStartE2EDuration="4.452424862s" podCreationTimestamp="2025-10-01 13:26:40 +0000 UTC" firstStartedPulling="2025-10-01 13:26:41.471462029 +0000 UTC m=+1261.525446928" lastFinishedPulling="2025-10-01 13:26:41.781613168 +0000 UTC m=+1261.835598067" observedRunningTime="2025-10-01 13:26:44.439332195 +0000 UTC m=+1264.493317104" watchObservedRunningTime="2025-10-01 13:26:44.452424862 +0000 UTC m=+1264.506409761" Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.466195 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.46617355 podStartE2EDuration="4.46617355s" podCreationTimestamp="2025-10-01 13:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:44.457280476 +0000 UTC m=+1264.511265375" watchObservedRunningTime="2025-10-01 13:26:44.46617355 +0000 UTC m=+1264.520158449" Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.594029 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.594066 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.651047 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:44 crc kubenswrapper[4749]: I1001 13:26:44.654527 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.207699 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.329585 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxv6l\" (UniqueName: \"kubernetes.io/projected/b463a6a1-a887-4e8c-a62d-88b39b076c64-kube-api-access-gxv6l\") pod \"b463a6a1-a887-4e8c-a62d-88b39b076c64\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.329663 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-scripts\") pod \"b463a6a1-a887-4e8c-a62d-88b39b076c64\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.329881 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b463a6a1-a887-4e8c-a62d-88b39b076c64-logs\") pod \"b463a6a1-a887-4e8c-a62d-88b39b076c64\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.329926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b463a6a1-a887-4e8c-a62d-88b39b076c64-etc-machine-id\") pod \"b463a6a1-a887-4e8c-a62d-88b39b076c64\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.329957 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data\") pod \"b463a6a1-a887-4e8c-a62d-88b39b076c64\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.330061 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data-custom\") pod \"b463a6a1-a887-4e8c-a62d-88b39b076c64\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.330085 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-combined-ca-bundle\") pod \"b463a6a1-a887-4e8c-a62d-88b39b076c64\" (UID: \"b463a6a1-a887-4e8c-a62d-88b39b076c64\") " Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.330155 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b463a6a1-a887-4e8c-a62d-88b39b076c64-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b463a6a1-a887-4e8c-a62d-88b39b076c64" (UID: "b463a6a1-a887-4e8c-a62d-88b39b076c64"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.330432 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b463a6a1-a887-4e8c-a62d-88b39b076c64-logs" (OuterVolumeSpecName: "logs") pod "b463a6a1-a887-4e8c-a62d-88b39b076c64" (UID: "b463a6a1-a887-4e8c-a62d-88b39b076c64"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.330475 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b463a6a1-a887-4e8c-a62d-88b39b076c64-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.345386 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b463a6a1-a887-4e8c-a62d-88b39b076c64" (UID: "b463a6a1-a887-4e8c-a62d-88b39b076c64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.346153 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-scripts" (OuterVolumeSpecName: "scripts") pod "b463a6a1-a887-4e8c-a62d-88b39b076c64" (UID: "b463a6a1-a887-4e8c-a62d-88b39b076c64"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.351948 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b463a6a1-a887-4e8c-a62d-88b39b076c64-kube-api-access-gxv6l" (OuterVolumeSpecName: "kube-api-access-gxv6l") pod "b463a6a1-a887-4e8c-a62d-88b39b076c64" (UID: "b463a6a1-a887-4e8c-a62d-88b39b076c64"). InnerVolumeSpecName "kube-api-access-gxv6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.410584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data" (OuterVolumeSpecName: "config-data") pod "b463a6a1-a887-4e8c-a62d-88b39b076c64" (UID: "b463a6a1-a887-4e8c-a62d-88b39b076c64"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.420341 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b463a6a1-a887-4e8c-a62d-88b39b076c64" (UID: "b463a6a1-a887-4e8c-a62d-88b39b076c64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.432506 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b463a6a1-a887-4e8c-a62d-88b39b076c64-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.432545 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.432560 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.432577 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.432589 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxv6l\" (UniqueName: \"kubernetes.io/projected/b463a6a1-a887-4e8c-a62d-88b39b076c64-kube-api-access-gxv6l\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.432601 4749 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b463a6a1-a887-4e8c-a62d-88b39b076c64-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.443367 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerStarted","Data":"f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a"} Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.443419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerStarted","Data":"26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f"} Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.445706 4749 generic.go:334] "Generic (PLEG): container finished" podID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerID="79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397" exitCode=0 Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.445726 4749 generic.go:334] "Generic (PLEG): container finished" podID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerID="6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56" exitCode=143 Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.446109 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.446134 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.447041 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.447319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b463a6a1-a887-4e8c-a62d-88b39b076c64","Type":"ContainerDied","Data":"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397"} Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.447378 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.447394 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.447403 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b463a6a1-a887-4e8c-a62d-88b39b076c64","Type":"ContainerDied","Data":"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56"} Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.447419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"b463a6a1-a887-4e8c-a62d-88b39b076c64","Type":"ContainerDied","Data":"e48c20dabad7a319ae0465ff057538ae568a3d33cb3f41519774839db48cc151"} Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.447436 4749 scope.go:117] "RemoveContainer" containerID="79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.494743 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.495370 4749 scope.go:117] "RemoveContainer" containerID="6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.507688 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 
13:26:45.520318 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:45 crc kubenswrapper[4749]: E1001 13:26:45.520736 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerName="cinder-api" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.520752 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerName="cinder-api" Oct 01 13:26:45 crc kubenswrapper[4749]: E1001 13:26:45.520779 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerName="cinder-api-log" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.520785 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerName="cinder-api-log" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.520975 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerName="cinder-api-log" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.521003 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" containerName="cinder-api" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.522008 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.525629 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.526177 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.526204 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.531675 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.557584 4749 scope.go:117] "RemoveContainer" containerID="79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397" Oct 01 13:26:45 crc kubenswrapper[4749]: E1001 13:26:45.559152 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397\": container with ID starting with 79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397 not found: ID does not exist" containerID="79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.559195 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397"} err="failed to get container status \"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397\": rpc error: code = NotFound desc = could not find container \"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397\": container with ID starting with 79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397 not found: ID does not exist" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 
13:26:45.559232 4749 scope.go:117] "RemoveContainer" containerID="6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56" Oct 01 13:26:45 crc kubenswrapper[4749]: E1001 13:26:45.559449 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56\": container with ID starting with 6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56 not found: ID does not exist" containerID="6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.559466 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56"} err="failed to get container status \"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56\": rpc error: code = NotFound desc = could not find container \"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56\": container with ID starting with 6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56 not found: ID does not exist" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.559478 4749 scope.go:117] "RemoveContainer" containerID="79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.559621 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397"} err="failed to get container status \"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397\": rpc error: code = NotFound desc = could not find container \"79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397\": container with ID starting with 79f74e415d0673a4a1bfa992740e11dc18774f4a0a6bacc010363d0ff6d9c397 not found: ID does not exist" Oct 01 13:26:45 crc 
kubenswrapper[4749]: I1001 13:26:45.559635 4749 scope.go:117] "RemoveContainer" containerID="6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.559771 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56"} err="failed to get container status \"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56\": rpc error: code = NotFound desc = could not find container \"6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56\": container with ID starting with 6a768b5ea8707fb8f46d746a6e15dddd58d827ebf36c22d7d3e23efa81604c56 not found: ID does not exist" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.635434 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-config-data-custom\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.635692 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-scripts\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.635786 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.635912 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468beb78-1358-4a1b-ad2c-3941f3f270c6-logs\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.635992 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.636081 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzx8b\" (UniqueName: \"kubernetes.io/projected/468beb78-1358-4a1b-ad2c-3941f3f270c6-kube-api-access-fzx8b\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.636206 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/468beb78-1358-4a1b-ad2c-3941f3f270c6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.636294 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-config-data\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.636387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.737906 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-config-data\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.737960 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.737993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-config-data-custom\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.738060 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-scripts\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.739054 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 
13:26:45.739103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468beb78-1358-4a1b-ad2c-3941f3f270c6-logs\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.739126 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.739161 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzx8b\" (UniqueName: \"kubernetes.io/projected/468beb78-1358-4a1b-ad2c-3941f3f270c6-kube-api-access-fzx8b\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.739226 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/468beb78-1358-4a1b-ad2c-3941f3f270c6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.739292 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/468beb78-1358-4a1b-ad2c-3941f3f270c6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.739815 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/468beb78-1358-4a1b-ad2c-3941f3f270c6-logs\") pod \"cinder-api-0\" (UID: 
\"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.743083 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-config-data-custom\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.743776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.744272 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-config-data\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.745447 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.746610 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-scripts\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.747675 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/468beb78-1358-4a1b-ad2c-3941f3f270c6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.757677 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzx8b\" (UniqueName: \"kubernetes.io/projected/468beb78-1358-4a1b-ad2c-3941f3f270c6-kube-api-access-fzx8b\") pod \"cinder-api-0\" (UID: \"468beb78-1358-4a1b-ad2c-3941f3f270c6\") " pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.849729 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 13:26:45 crc kubenswrapper[4749]: I1001 13:26:45.877867 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 13:26:46 crc kubenswrapper[4749]: I1001 13:26:46.359620 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 13:26:46 crc kubenswrapper[4749]: I1001 13:26:46.461022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"468beb78-1358-4a1b-ad2c-3941f3f270c6","Type":"ContainerStarted","Data":"0a0a1d801108ff8845765f76c5f822958d90c55a3a19b29eaf7e08c8e5a6bfab"} Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.246557 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b463a6a1-a887-4e8c-a62d-88b39b076c64" path="/var/lib/kubelet/pods/b463a6a1-a887-4e8c-a62d-88b39b076c64/volumes" Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.479762 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.479795 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.479758 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"468beb78-1358-4a1b-ad2c-3941f3f270c6","Type":"ContainerStarted","Data":"d9a9ca4feee57f292e6baf343b999bddddc7acb8c6b2b505f05c746ac5407bcd"} Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.580418 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.580736 4749 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.582722 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.769748 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:47 crc kubenswrapper[4749]: I1001 13:26:47.887945 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 13:26:48 crc kubenswrapper[4749]: I1001 13:26:48.496855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"468beb78-1358-4a1b-ad2c-3941f3f270c6","Type":"ContainerStarted","Data":"18357da2210d4ca885b3b9fce7acc9b91eb8b6ac7202a29ea40d1751fb03304a"} Oct 01 13:26:49 crc kubenswrapper[4749]: I1001 13:26:49.511502 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.008709 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.043734 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.043716134 podStartE2EDuration="6.043716134s" podCreationTimestamp="2025-10-01 13:26:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:26:48.531722965 +0000 UTC m=+1268.585707864" watchObservedRunningTime="2025-10-01 13:26:51.043716134 +0000 UTC m=+1271.097701033" Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.080896 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68c474b9fc-mn4ts"] Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.081714 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerName="dnsmasq-dns" containerID="cri-o://496383534a049afa7139823bfc265a218724711b434d4209d1f1f68e8eb07326" gracePeriod=10 Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.362961 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.434750 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.536689 4749 generic.go:334] "Generic (PLEG): container finished" podID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerID="496383534a049afa7139823bfc265a218724711b434d4209d1f1f68e8eb07326" exitCode=0 Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.536773 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" event={"ID":"b34ccf83-0af2-438a-ad7f-c79c6886db75","Type":"ContainerDied","Data":"496383534a049afa7139823bfc265a218724711b434d4209d1f1f68e8eb07326"} Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.536904 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerName="cinder-scheduler" 
containerID="cri-o://e868b2b2be00db01d900882983309c3cf3a3c2615b18db834d761bf3b0b242a2" gracePeriod=30 Oct 01 13:26:51 crc kubenswrapper[4749]: I1001 13:26:51.536974 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerName="probe" containerID="cri-o://602bde3846acfcbf2ea9e335da579153206a8607a608b0b10213243bd5d3f32b" gracePeriod=30 Oct 01 13:26:54 crc kubenswrapper[4749]: I1001 13:26:54.576335 4749 generic.go:334] "Generic (PLEG): container finished" podID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerID="602bde3846acfcbf2ea9e335da579153206a8607a608b0b10213243bd5d3f32b" exitCode=0 Oct 01 13:26:54 crc kubenswrapper[4749]: I1001 13:26:54.576379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a25e1c1-3574-4c2c-b471-ed9e4614fd06","Type":"ContainerDied","Data":"602bde3846acfcbf2ea9e335da579153206a8607a608b0b10213243bd5d3f32b"} Oct 01 13:26:55 crc kubenswrapper[4749]: I1001 13:26:55.132551 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.177:5353: connect: connection refused" Oct 01 13:26:59 crc kubenswrapper[4749]: I1001 13:26:59.861399 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="468beb78-1358-4a1b-ad2c-3941f3f270c6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.198:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.538774 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.669161 4749 generic.go:334] "Generic (PLEG): container finished" podID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerID="e868b2b2be00db01d900882983309c3cf3a3c2615b18db834d761bf3b0b242a2" exitCode=0 Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.669280 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a25e1c1-3574-4c2c-b471-ed9e4614fd06","Type":"ContainerDied","Data":"e868b2b2be00db01d900882983309c3cf3a3c2615b18db834d761bf3b0b242a2"} Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.672009 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" event={"ID":"b34ccf83-0af2-438a-ad7f-c79c6886db75","Type":"ContainerDied","Data":"b3bf5fe9336ad6bb370c8ab6acdfb3ece14b7b902ce4ba7bddf4c5f3c6bf00a0"} Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.672051 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.672076 4749 scope.go:117] "RemoveContainer" containerID="496383534a049afa7139823bfc265a218724711b434d4209d1f1f68e8eb07326" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.677576 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-nb\") pod \"b34ccf83-0af2-438a-ad7f-c79c6886db75\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.678374 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-sb\") pod \"b34ccf83-0af2-438a-ad7f-c79c6886db75\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.678445 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbmq7\" (UniqueName: \"kubernetes.io/projected/b34ccf83-0af2-438a-ad7f-c79c6886db75-kube-api-access-hbmq7\") pod \"b34ccf83-0af2-438a-ad7f-c79c6886db75\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.678603 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-swift-storage-0\") pod \"b34ccf83-0af2-438a-ad7f-c79c6886db75\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.678671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-svc\") pod \"b34ccf83-0af2-438a-ad7f-c79c6886db75\" (UID: 
\"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.678692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-config\") pod \"b34ccf83-0af2-438a-ad7f-c79c6886db75\" (UID: \"b34ccf83-0af2-438a-ad7f-c79c6886db75\") " Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.688309 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34ccf83-0af2-438a-ad7f-c79c6886db75-kube-api-access-hbmq7" (OuterVolumeSpecName: "kube-api-access-hbmq7") pod "b34ccf83-0af2-438a-ad7f-c79c6886db75" (UID: "b34ccf83-0af2-438a-ad7f-c79c6886db75"). InnerVolumeSpecName "kube-api-access-hbmq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.704778 4749 scope.go:117] "RemoveContainer" containerID="958fc4615311f4201bc99f778be02331e193477546d3eeeb867686d1ad7e430c" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.724444 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b34ccf83-0af2-438a-ad7f-c79c6886db75" (UID: "b34ccf83-0af2-438a-ad7f-c79c6886db75"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.734288 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b34ccf83-0af2-438a-ad7f-c79c6886db75" (UID: "b34ccf83-0af2-438a-ad7f-c79c6886db75"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.740327 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b34ccf83-0af2-438a-ad7f-c79c6886db75" (UID: "b34ccf83-0af2-438a-ad7f-c79c6886db75"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.746303 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-config" (OuterVolumeSpecName: "config") pod "b34ccf83-0af2-438a-ad7f-c79c6886db75" (UID: "b34ccf83-0af2-438a-ad7f-c79c6886db75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.751315 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b34ccf83-0af2-438a-ad7f-c79c6886db75" (UID: "b34ccf83-0af2-438a-ad7f-c79c6886db75"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.781918 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.781952 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.781968 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbmq7\" (UniqueName: \"kubernetes.io/projected/b34ccf83-0af2-438a-ad7f-c79c6886db75-kube-api-access-hbmq7\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.781983 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.781995 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.782007 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ccf83-0af2-438a-ad7f-c79c6886db75-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:00 crc kubenswrapper[4749]: I1001 13:27:00.854414 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="468beb78-1358-4a1b-ad2c-3941f3f270c6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.198:8776/healthcheck\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Oct 01 13:27:01 crc kubenswrapper[4749]: I1001 13:27:01.014113 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68c474b9fc-mn4ts"] Oct 01 13:27:01 crc kubenswrapper[4749]: I1001 13:27:01.025784 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68c474b9fc-mn4ts"] Oct 01 13:27:01 crc kubenswrapper[4749]: I1001 13:27:01.247376 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" path="/var/lib/kubelet/pods/b34ccf83-0af2-438a-ad7f-c79c6886db75/volumes" Oct 01 13:27:01 crc kubenswrapper[4749]: I1001 13:27:01.823909 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.002388 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data\") pod \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.002602 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5psr\" (UniqueName: \"kubernetes.io/projected/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-kube-api-access-t5psr\") pod \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.002653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-etc-machine-id\") pod \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.002682 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-combined-ca-bundle\") pod \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.002718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-scripts\") pod \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.002787 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0a25e1c1-3574-4c2c-b471-ed9e4614fd06" (UID: "0a25e1c1-3574-4c2c-b471-ed9e4614fd06"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.002832 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data-custom\") pod \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\" (UID: \"0a25e1c1-3574-4c2c-b471-ed9e4614fd06\") " Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.003416 4749 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.011592 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-scripts" (OuterVolumeSpecName: "scripts") pod "0a25e1c1-3574-4c2c-b471-ed9e4614fd06" (UID: "0a25e1c1-3574-4c2c-b471-ed9e4614fd06"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.013735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-kube-api-access-t5psr" (OuterVolumeSpecName: "kube-api-access-t5psr") pod "0a25e1c1-3574-4c2c-b471-ed9e4614fd06" (UID: "0a25e1c1-3574-4c2c-b471-ed9e4614fd06"). InnerVolumeSpecName "kube-api-access-t5psr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.015472 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0a25e1c1-3574-4c2c-b471-ed9e4614fd06" (UID: "0a25e1c1-3574-4c2c-b471-ed9e4614fd06"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.067449 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a25e1c1-3574-4c2c-b471-ed9e4614fd06" (UID: "0a25e1c1-3574-4c2c-b471-ed9e4614fd06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.105919 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.105973 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.106201 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5psr\" (UniqueName: \"kubernetes.io/projected/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-kube-api-access-t5psr\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.106251 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.106264 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.106276 4749 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.115477 4749 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data" (OuterVolumeSpecName: "config-data") pod "0a25e1c1-3574-4c2c-b471-ed9e4614fd06" (UID: "0a25e1c1-3574-4c2c-b471-ed9e4614fd06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.208397 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a25e1c1-3574-4c2c-b471-ed9e4614fd06-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.605208 4749 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest" Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.605309 4749 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest" Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.605494 4749 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:38.102.83.30:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wqdk7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-ts66r_openstack(a63bfe18-de5a-44e8-abef-17ee0f2af92a): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.606764 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-ts66r" podUID="a63bfe18-de5a-44e8-abef-17ee0f2af92a" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.695480 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.695489 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a25e1c1-3574-4c2c-b471-ed9e4614fd06","Type":"ContainerDied","Data":"da09e43173409edcaf46ad2c8ce2d481023ba73746f023725072ee772e2e6d79"} Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.695549 4749 scope.go:117] "RemoveContainer" containerID="602bde3846acfcbf2ea9e335da579153206a8607a608b0b10213243bd5d3f32b" Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.697534 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-nova-conductor:watcher_latest\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-ts66r" podUID="a63bfe18-de5a-44e8-abef-17ee0f2af92a" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.719924 4749 scope.go:117] "RemoveContainer" containerID="e868b2b2be00db01d900882983309c3cf3a3c2615b18db834d761bf3b0b242a2" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.749313 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.760123 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-scheduler-0"] Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.788774 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.789740 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerName="cinder-scheduler" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.789773 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerName="cinder-scheduler" Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.789805 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerName="dnsmasq-dns" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.789818 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerName="dnsmasq-dns" Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.790409 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerName="init" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.790439 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerName="init" Oct 01 13:27:02 crc kubenswrapper[4749]: E1001 13:27:02.790493 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerName="probe" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.790506 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerName="probe" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.790821 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerName="dnsmasq-dns" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.790869 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerName="probe" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.790898 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" containerName="cinder-scheduler" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.792632 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.820853 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.823071 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.823116 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/005bd9d5-4799-4763-aa6b-46a9341c36d2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.823157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-scripts\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.823301 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-config-data\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.823339 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.823406 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67q8d\" (UniqueName: \"kubernetes.io/projected/005bd9d5-4799-4763-aa6b-46a9341c36d2-kube-api-access-67q8d\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.833376 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.928349 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-config-data\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.928403 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.928435 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-67q8d\" (UniqueName: \"kubernetes.io/projected/005bd9d5-4799-4763-aa6b-46a9341c36d2-kube-api-access-67q8d\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.929019 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.929078 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/005bd9d5-4799-4763-aa6b-46a9341c36d2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.929129 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-scripts\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.934179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/005bd9d5-4799-4763-aa6b-46a9341c36d2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.934895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.937192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.950650 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.953351 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/005bd9d5-4799-4763-aa6b-46a9341c36d2-config-data\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:02 crc kubenswrapper[4749]: I1001 13:27:02.955775 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67q8d\" (UniqueName: \"kubernetes.io/projected/005bd9d5-4799-4763-aa6b-46a9341c36d2-kube-api-access-67q8d\") pod \"cinder-scheduler-0\" (UID: \"005bd9d5-4799-4763-aa6b-46a9341c36d2\") " pod="openstack/cinder-scheduler-0" Oct 01 13:27:03 crc kubenswrapper[4749]: I1001 13:27:03.144439 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 13:27:03 crc kubenswrapper[4749]: I1001 13:27:03.246606 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a25e1c1-3574-4c2c-b471-ed9e4614fd06" path="/var/lib/kubelet/pods/0a25e1c1-3574-4c2c-b471-ed9e4614fd06/volumes" Oct 01 13:27:03 crc kubenswrapper[4749]: I1001 13:27:03.651051 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 13:27:03 crc kubenswrapper[4749]: W1001 13:27:03.653786 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod005bd9d5_4799_4763_aa6b_46a9341c36d2.slice/crio-eb7e5e97c66630d1db36ce63ff76a14198797993bb8170b2d07cccb45992431c WatchSource:0}: Error finding container eb7e5e97c66630d1db36ce63ff76a14198797993bb8170b2d07cccb45992431c: Status 404 returned error can't find the container with id eb7e5e97c66630d1db36ce63ff76a14198797993bb8170b2d07cccb45992431c Oct 01 13:27:03 crc kubenswrapper[4749]: I1001 13:27:03.707760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"005bd9d5-4799-4763-aa6b-46a9341c36d2","Type":"ContainerStarted","Data":"eb7e5e97c66630d1db36ce63ff76a14198797993bb8170b2d07cccb45992431c"} Oct 01 13:27:04 crc kubenswrapper[4749]: I1001 13:27:04.867557 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="468beb78-1358-4a1b-ad2c-3941f3f270c6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.198:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:27:05 crc kubenswrapper[4749]: I1001 13:27:05.133118 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-68c474b9fc-mn4ts" podUID="b34ccf83-0af2-438a-ad7f-c79c6886db75" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.177:5353: i/o timeout" Oct 
01 13:27:05 crc kubenswrapper[4749]: I1001 13:27:05.734916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"005bd9d5-4799-4763-aa6b-46a9341c36d2","Type":"ContainerStarted","Data":"0c37dd64690d250f2f4925b438d3660162199e1e07e470c9202921196bbf3b09"} Oct 01 13:27:05 crc kubenswrapper[4749]: I1001 13:27:05.858440 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="468beb78-1358-4a1b-ad2c-3941f3f270c6" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.198:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:27:07 crc kubenswrapper[4749]: I1001 13:27:07.258343 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 13:27:08 crc kubenswrapper[4749]: I1001 13:27:08.776790 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerStarted","Data":"b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8"} Oct 01 13:27:08 crc kubenswrapper[4749]: I1001 13:27:08.777548 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:27:08 crc kubenswrapper[4749]: I1001 13:27:08.806104 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.875351006 podStartE2EDuration="26.806084348s" podCreationTimestamp="2025-10-01 13:26:42 +0000 UTC" firstStartedPulling="2025-10-01 13:26:43.428631562 +0000 UTC m=+1263.482616461" lastFinishedPulling="2025-10-01 13:27:08.359364884 +0000 UTC m=+1288.413349803" observedRunningTime="2025-10-01 13:27:08.800860643 +0000 UTC m=+1288.854845592" watchObservedRunningTime="2025-10-01 13:27:08.806084348 +0000 UTC m=+1288.860069247" Oct 01 13:27:09 crc kubenswrapper[4749]: I1001 13:27:09.791254 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"005bd9d5-4799-4763-aa6b-46a9341c36d2","Type":"ContainerStarted","Data":"b22ac331d91cd4f277fbaea9882df0ad8c67ffd7710e3136cfc562a273a0c8df"} Oct 01 13:27:13 crc kubenswrapper[4749]: I1001 13:27:13.146481 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 13:27:13 crc kubenswrapper[4749]: I1001 13:27:13.331524 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 13:27:13 crc kubenswrapper[4749]: I1001 13:27:13.355692 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=11.355670763 podStartE2EDuration="11.355670763s" podCreationTimestamp="2025-10-01 13:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:27:09.822880631 +0000 UTC m=+1289.876865530" watchObservedRunningTime="2025-10-01 13:27:13.355670763 +0000 UTC m=+1293.409655662" Oct 01 13:27:13 crc kubenswrapper[4749]: I1001 13:27:13.755813 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:27:13 crc kubenswrapper[4749]: I1001 13:27:13.756120 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="ceilometer-central-agent" containerID="cri-o://d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2" gracePeriod=30 Oct 01 13:27:13 crc kubenswrapper[4749]: I1001 13:27:13.756312 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="proxy-httpd" containerID="cri-o://b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8" gracePeriod=30 Oct 01 13:27:13 crc 
kubenswrapper[4749]: I1001 13:27:13.756371 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="sg-core" containerID="cri-o://f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a" gracePeriod=30 Oct 01 13:27:13 crc kubenswrapper[4749]: I1001 13:27:13.756423 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="ceilometer-notification-agent" containerID="cri-o://26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f" gracePeriod=30 Oct 01 13:27:14 crc kubenswrapper[4749]: I1001 13:27:14.847643 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerID="b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8" exitCode=0 Oct 01 13:27:14 crc kubenswrapper[4749]: I1001 13:27:14.847899 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerID="f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a" exitCode=2 Oct 01 13:27:14 crc kubenswrapper[4749]: I1001 13:27:14.847907 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerID="d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2" exitCode=0 Oct 01 13:27:14 crc kubenswrapper[4749]: I1001 13:27:14.847733 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerDied","Data":"b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8"} Oct 01 13:27:14 crc kubenswrapper[4749]: I1001 13:27:14.847944 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerDied","Data":"f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a"} Oct 01 13:27:14 crc kubenswrapper[4749]: I1001 13:27:14.847961 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerDied","Data":"d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2"} Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.787228 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.848544 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-combined-ca-bundle\") pod \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.848659 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-log-httpd\") pod \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.848678 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-scripts\") pod \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.848762 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hpqq\" (UniqueName: \"kubernetes.io/projected/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-kube-api-access-9hpqq\") pod \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\" (UID: 
\"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.848797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-sg-core-conf-yaml\") pod \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.848894 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-config-data\") pod \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.848919 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-run-httpd\") pod \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\" (UID: \"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5\") " Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.849374 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" (UID: "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.849467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" (UID: "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.858387 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-scripts" (OuterVolumeSpecName: "scripts") pod "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" (UID: "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.860857 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-kube-api-access-9hpqq" (OuterVolumeSpecName: "kube-api-access-9hpqq") pod "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" (UID: "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5"). InnerVolumeSpecName "kube-api-access-9hpqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.877314 4749 generic.go:334] "Generic (PLEG): container finished" podID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerID="26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f" exitCode=0 Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.877379 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerDied","Data":"26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f"} Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.877391 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.877419 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4f73fdbc-2d5f-4770-856b-a2a180bfd6c5","Type":"ContainerDied","Data":"e313525b9305daf3015b829dff530c9016b62be57528e6d937698595a19208fe"} Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.877448 4749 scope.go:117] "RemoveContainer" containerID="b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.884459 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" (UID: "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.940274 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" (UID: "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.952068 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.952099 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.952108 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.952116 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hpqq\" (UniqueName: \"kubernetes.io/projected/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-kube-api-access-9hpqq\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.952126 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.952135 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.986472 4749 scope.go:117] "RemoveContainer" containerID="f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a" Oct 01 13:27:15 crc kubenswrapper[4749]: I1001 13:27:15.994040 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-config-data" (OuterVolumeSpecName: "config-data") pod "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" (UID: "4f73fdbc-2d5f-4770-856b-a2a180bfd6c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.009691 4749 scope.go:117] "RemoveContainer" containerID="26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.028669 4749 scope.go:117] "RemoveContainer" containerID="d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.053618 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.060305 4749 scope.go:117] "RemoveContainer" containerID="b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8" Oct 01 13:27:16 crc kubenswrapper[4749]: E1001 13:27:16.060895 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8\": container with ID starting with b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8 not found: ID does not exist" containerID="b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.060941 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8"} err="failed to get container status \"b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8\": rpc error: code = NotFound desc = could not find container 
\"b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8\": container with ID starting with b456c908d4c0e8f8b17a81272f58b4fd58cd600fab386d9f7e6998c915d865c8 not found: ID does not exist" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.060969 4749 scope.go:117] "RemoveContainer" containerID="f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a" Oct 01 13:27:16 crc kubenswrapper[4749]: E1001 13:27:16.061461 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a\": container with ID starting with f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a not found: ID does not exist" containerID="f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.061488 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a"} err="failed to get container status \"f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a\": rpc error: code = NotFound desc = could not find container \"f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a\": container with ID starting with f287115c4571264d56036b61887a0c4461168c7e9358b6696e057b731a795f1a not found: ID does not exist" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.061507 4749 scope.go:117] "RemoveContainer" containerID="26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f" Oct 01 13:27:16 crc kubenswrapper[4749]: E1001 13:27:16.061804 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f\": container with ID starting with 26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f not found: ID does not exist" 
containerID="26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.061881 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f"} err="failed to get container status \"26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f\": rpc error: code = NotFound desc = could not find container \"26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f\": container with ID starting with 26b7b86dc6ea150a76be6af35a856c62c8d161a61677cb87eb289bd75d4e477f not found: ID does not exist" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.061908 4749 scope.go:117] "RemoveContainer" containerID="d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2" Oct 01 13:27:16 crc kubenswrapper[4749]: E1001 13:27:16.062177 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2\": container with ID starting with d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2 not found: ID does not exist" containerID="d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.062203 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2"} err="failed to get container status \"d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2\": rpc error: code = NotFound desc = could not find container \"d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2\": container with ID starting with d2b45539c9f9171acfd3841fae2bc341e4002bcf222424590f4a6138bed30be2 not found: ID does not exist" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.223166 4749 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.241260 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.269923 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:27:16 crc kubenswrapper[4749]: E1001 13:27:16.270564 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="ceilometer-notification-agent" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.270596 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="ceilometer-notification-agent" Oct 01 13:27:16 crc kubenswrapper[4749]: E1001 13:27:16.270622 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="sg-core" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.270632 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="sg-core" Oct 01 13:27:16 crc kubenswrapper[4749]: E1001 13:27:16.270648 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="ceilometer-central-agent" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.270655 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="ceilometer-central-agent" Oct 01 13:27:16 crc kubenswrapper[4749]: E1001 13:27:16.270667 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="proxy-httpd" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.270675 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="proxy-httpd" Oct 01 13:27:16 crc 
kubenswrapper[4749]: I1001 13:27:16.270957 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="proxy-httpd" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.270982 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="ceilometer-central-agent" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.270994 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="ceilometer-notification-agent" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.271029 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" containerName="sg-core" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.275068 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.281537 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.284288 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.284599 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.360354 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-scripts\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.361002 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-run-httpd\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.361293 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.361680 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-log-httpd\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.362048 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fxzj\" (UniqueName: \"kubernetes.io/projected/729e9ae9-3bf2-4e33-95e4-33f96da93660-kube-api-access-5fxzj\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.362426 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-config-data\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.362722 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.465184 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-config-data\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.465280 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.465418 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-scripts\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.465633 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-run-httpd\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.465699 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.465859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-log-httpd\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.466012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fxzj\" (UniqueName: \"kubernetes.io/projected/729e9ae9-3bf2-4e33-95e4-33f96da93660-kube-api-access-5fxzj\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.466983 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-run-httpd\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.467214 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-log-httpd\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.472212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-scripts\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.472261 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.473807 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-config-data\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.473879 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.492293 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fxzj\" (UniqueName: \"kubernetes.io/projected/729e9ae9-3bf2-4e33-95e4-33f96da93660-kube-api-access-5fxzj\") pod \"ceilometer-0\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " pod="openstack/ceilometer-0" Oct 01 13:27:16 crc kubenswrapper[4749]: I1001 13:27:16.612324 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:27:17 crc kubenswrapper[4749]: I1001 13:27:17.138917 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:27:17 crc kubenswrapper[4749]: W1001 13:27:17.149464 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod729e9ae9_3bf2_4e33_95e4_33f96da93660.slice/crio-2a1126e0e98ad9c97d4ded14134109d41feeed7d89fab8b646a827a3c0fc8b18 WatchSource:0}: Error finding container 2a1126e0e98ad9c97d4ded14134109d41feeed7d89fab8b646a827a3c0fc8b18: Status 404 returned error can't find the container with id 2a1126e0e98ad9c97d4ded14134109d41feeed7d89fab8b646a827a3c0fc8b18 Oct 01 13:27:17 crc kubenswrapper[4749]: I1001 13:27:17.252944 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f73fdbc-2d5f-4770-856b-a2a180bfd6c5" path="/var/lib/kubelet/pods/4f73fdbc-2d5f-4770-856b-a2a180bfd6c5/volumes" Oct 01 13:27:17 crc kubenswrapper[4749]: I1001 13:27:17.920272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ts66r" event={"ID":"a63bfe18-de5a-44e8-abef-17ee0f2af92a","Type":"ContainerStarted","Data":"b0121f7354627d9fa574df9e28d73e13f44cdb1be6512aed3224e1dfb5cb8e07"} Oct 01 13:27:17 crc kubenswrapper[4749]: I1001 13:27:17.923498 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerStarted","Data":"5ac823d6a17967ad62e81f6352f956bc3f3ae0ec1b539bf76cd6533923a16b8d"} Oct 01 13:27:17 crc kubenswrapper[4749]: I1001 13:27:17.923543 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerStarted","Data":"2a1126e0e98ad9c97d4ded14134109d41feeed7d89fab8b646a827a3c0fc8b18"} Oct 01 13:27:17 crc kubenswrapper[4749]: I1001 13:27:17.941240 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ts66r" podStartSLOduration=1.618003577 podStartE2EDuration="36.941197413s" podCreationTimestamp="2025-10-01 13:26:41 +0000 UTC" firstStartedPulling="2025-10-01 13:26:42.004826441 +0000 UTC m=+1262.058811340" lastFinishedPulling="2025-10-01 13:27:17.328020267 +0000 UTC m=+1297.382005176" observedRunningTime="2025-10-01 13:27:17.938608596 +0000 UTC m=+1297.992593495" watchObservedRunningTime="2025-10-01 13:27:17.941197413 +0000 UTC m=+1297.995182342" Oct 01 13:27:18 crc kubenswrapper[4749]: I1001 13:27:18.940916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerStarted","Data":"63038e8b4c7657fa884dc3d46ca1ada1c7c8a35c97bcf88c126efcf6c5f02c57"} Oct 01 13:27:18 crc kubenswrapper[4749]: I1001 13:27:18.940983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerStarted","Data":"440441178e80869539f16ee0b9e3dc35beeebb8ce05fb21f5cf5248df128a6c3"} Oct 01 13:27:20 crc kubenswrapper[4749]: I1001 13:27:20.973466 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerStarted","Data":"2ccd41c2c96f472955ca03754b9527ab834fc368b5d0f02db355abaa8c24fb9e"} Oct 01 13:27:20 crc kubenswrapper[4749]: I1001 13:27:20.975351 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:27:21 crc kubenswrapper[4749]: I1001 13:27:21.008110 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.163291812 podStartE2EDuration="5.008094071s" podCreationTimestamp="2025-10-01 13:27:16 +0000 UTC" firstStartedPulling="2025-10-01 13:27:17.153883158 +0000 UTC m=+1297.207868097" 
lastFinishedPulling="2025-10-01 13:27:19.998685447 +0000 UTC m=+1300.052670356" observedRunningTime="2025-10-01 13:27:21.002823175 +0000 UTC m=+1301.056808074" watchObservedRunningTime="2025-10-01 13:27:21.008094071 +0000 UTC m=+1301.062078970" Oct 01 13:27:32 crc kubenswrapper[4749]: I1001 13:27:32.106447 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:27:32 crc kubenswrapper[4749]: I1001 13:27:32.108441 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:27:33 crc kubenswrapper[4749]: I1001 13:27:33.117740 4749 generic.go:334] "Generic (PLEG): container finished" podID="a63bfe18-de5a-44e8-abef-17ee0f2af92a" containerID="b0121f7354627d9fa574df9e28d73e13f44cdb1be6512aed3224e1dfb5cb8e07" exitCode=0 Oct 01 13:27:33 crc kubenswrapper[4749]: I1001 13:27:33.117815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ts66r" event={"ID":"a63bfe18-de5a-44e8-abef-17ee0f2af92a","Type":"ContainerDied","Data":"b0121f7354627d9fa574df9e28d73e13f44cdb1be6512aed3224e1dfb5cb8e07"} Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.569076 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.759108 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-config-data\") pod \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.759337 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqdk7\" (UniqueName: \"kubernetes.io/projected/a63bfe18-de5a-44e8-abef-17ee0f2af92a-kube-api-access-wqdk7\") pod \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.759427 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-combined-ca-bundle\") pod \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.759542 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-scripts\") pod \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\" (UID: \"a63bfe18-de5a-44e8-abef-17ee0f2af92a\") " Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.765385 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a63bfe18-de5a-44e8-abef-17ee0f2af92a-kube-api-access-wqdk7" (OuterVolumeSpecName: "kube-api-access-wqdk7") pod "a63bfe18-de5a-44e8-abef-17ee0f2af92a" (UID: "a63bfe18-de5a-44e8-abef-17ee0f2af92a"). InnerVolumeSpecName "kube-api-access-wqdk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.767581 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-scripts" (OuterVolumeSpecName: "scripts") pod "a63bfe18-de5a-44e8-abef-17ee0f2af92a" (UID: "a63bfe18-de5a-44e8-abef-17ee0f2af92a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.795508 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a63bfe18-de5a-44e8-abef-17ee0f2af92a" (UID: "a63bfe18-de5a-44e8-abef-17ee0f2af92a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.815176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-config-data" (OuterVolumeSpecName: "config-data") pod "a63bfe18-de5a-44e8-abef-17ee0f2af92a" (UID: "a63bfe18-de5a-44e8-abef-17ee0f2af92a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.862419 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.862464 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqdk7\" (UniqueName: \"kubernetes.io/projected/a63bfe18-de5a-44e8-abef-17ee0f2af92a-kube-api-access-wqdk7\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.862483 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:34 crc kubenswrapper[4749]: I1001 13:27:34.862497 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a63bfe18-de5a-44e8-abef-17ee0f2af92a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.155293 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ts66r" event={"ID":"a63bfe18-de5a-44e8-abef-17ee0f2af92a","Type":"ContainerDied","Data":"a405f8074ddbe52f56cc7388b9917f032dc3c40fc4849e790f5f1e39da4fa748"} Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.155341 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a405f8074ddbe52f56cc7388b9917f032dc3c40fc4849e790f5f1e39da4fa748" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.155359 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ts66r" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.306436 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 13:27:35 crc kubenswrapper[4749]: E1001 13:27:35.307579 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a63bfe18-de5a-44e8-abef-17ee0f2af92a" containerName="nova-cell0-conductor-db-sync" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.307613 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a63bfe18-de5a-44e8-abef-17ee0f2af92a" containerName="nova-cell0-conductor-db-sync" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.307974 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a63bfe18-de5a-44e8-abef-17ee0f2af92a" containerName="nova-cell0-conductor-db-sync" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.309326 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.311888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-fdsw2" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.313347 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.320749 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.377073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: 
I1001 13:27:35.377121 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.377233 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5wwr\" (UniqueName: \"kubernetes.io/projected/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-kube-api-access-p5wwr\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.479260 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.479425 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.479566 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5wwr\" (UniqueName: \"kubernetes.io/projected/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-kube-api-access-p5wwr\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.486180 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.488009 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.508504 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5wwr\" (UniqueName: \"kubernetes.io/projected/e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c-kube-api-access-p5wwr\") pod \"nova-cell0-conductor-0\" (UID: \"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c\") " pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:35 crc kubenswrapper[4749]: I1001 13:27:35.630131 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:36 crc kubenswrapper[4749]: I1001 13:27:36.095196 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 13:27:36 crc kubenswrapper[4749]: I1001 13:27:36.168139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c","Type":"ContainerStarted","Data":"810b80717f4840f4d39efc5428fc4a99b9dd74a4fc04c3419ad7c5a19decdbf0"} Oct 01 13:27:37 crc kubenswrapper[4749]: I1001 13:27:37.182987 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c","Type":"ContainerStarted","Data":"8ab015c82eb7cc6f4aeef5103fb984200b81a257493983e0243b59b3f538a75c"} Oct 01 13:27:37 crc kubenswrapper[4749]: I1001 13:27:37.184981 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:37 crc kubenswrapper[4749]: I1001 13:27:37.211799 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.211771857 podStartE2EDuration="2.211771857s" podCreationTimestamp="2025-10-01 13:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:27:37.202039098 +0000 UTC m=+1317.256024057" watchObservedRunningTime="2025-10-01 13:27:37.211771857 +0000 UTC m=+1317.265756796" Oct 01 13:27:45 crc kubenswrapper[4749]: I1001 13:27:45.681510 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.329037 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-tkdsq"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.330871 4749 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.334687 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.334908 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.342070 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tkdsq"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.437460 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.440959 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.445902 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.452875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfml\" (UniqueName: \"kubernetes.io/projected/0dcfec60-454f-4182-b53d-280d182dee40-kube-api-access-gcfml\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.452943 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-scripts\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.452964 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.452989 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-config-data\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.458347 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.516311 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.519117 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.521375 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.557414 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2khqk\" (UniqueName: \"kubernetes.io/projected/8a209604-f49d-4844-a505-c3cb9202fb13-kube-api-access-2khqk\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.557509 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfml\" (UniqueName: \"kubernetes.io/projected/0dcfec60-454f-4182-b53d-280d182dee40-kube-api-access-gcfml\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.557543 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-config-data\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.557581 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-scripts\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.557600 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.557629 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-config-data\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.557650 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.557698 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a209604-f49d-4844-a505-c3cb9202fb13-logs\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.577097 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-scripts\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.578480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: 
\"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.587571 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.607123 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-config-data\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.643480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfml\" (UniqueName: \"kubernetes.io/projected/0dcfec60-454f-4182-b53d-280d182dee40-kube-api-access-gcfml\") pod \"nova-cell0-cell-mapping-tkdsq\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.659902 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.659966 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a209604-f49d-4844-a505-c3cb9202fb13-logs\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.660002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2khqk\" (UniqueName: \"kubernetes.io/projected/8a209604-f49d-4844-a505-c3cb9202fb13-kube-api-access-2khqk\") pod 
\"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.660079 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.660103 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-config-data\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.660132 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkzcs\" (UniqueName: \"kubernetes.io/projected/05a66324-287f-4459-b601-e79c152c1ba0-kube-api-access-vkzcs\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.660163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.676865 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a209604-f49d-4844-a505-c3cb9202fb13-logs\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc 
kubenswrapper[4749]: I1001 13:27:46.683903 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.685071 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.685106 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.689888 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-config-data\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.698528 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.700277 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.708770 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.724994 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2khqk\" (UniqueName: \"kubernetes.io/projected/8a209604-f49d-4844-a505-c3cb9202fb13-kube-api-access-2khqk\") pod \"nova-api-0\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") " pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.749278 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.750591 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.760190 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.761666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.761711 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkzcs\" (UniqueName: \"kubernetes.io/projected/05a66324-287f-4459-b601-e79c152c1ba0-kube-api-access-vkzcs\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.761748 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.772285 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.775142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.775789 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.782916 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.790356 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkzcs\" (UniqueName: \"kubernetes.io/projected/05a66324-287f-4459-b601-e79c152c1ba0-kube-api-access-vkzcs\") pod \"nova-cell1-novncproxy-0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.796687 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.814256 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-599b6fdf6c-xm8tc"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.815813 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.819757 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.833159 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-599b6fdf6c-xm8tc"] Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.870123 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnwql\" (UniqueName: \"kubernetes.io/projected/79c6d349-9c67-41bf-b71c-8a1ce73e765e-kube-api-access-rnwql\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.870194 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.870317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-config-data\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.870398 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.870530 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-config-data\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.870575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8deccf1-954a-4b62-bb58-b0221689240e-logs\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.870653 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt2bb\" (UniqueName: \"kubernetes.io/projected/c8deccf1-954a-4b62-bb58-b0221689240e-kube-api-access-qt2bb\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973528 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-nb\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mshtn\" (UniqueName: \"kubernetes.io/projected/67837f8f-64b7-46b6-868c-8a9abb273f36-kube-api-access-mshtn\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-sb\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973679 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnwql\" (UniqueName: \"kubernetes.io/projected/79c6d349-9c67-41bf-b71c-8a1ce73e765e-kube-api-access-rnwql\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973751 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-config-data\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973780 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973819 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-config\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973853 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-config-data\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973876 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8deccf1-954a-4b62-bb58-b0221689240e-logs\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-swift-storage-0\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973918 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt2bb\" (UniqueName: \"kubernetes.io/projected/c8deccf1-954a-4b62-bb58-b0221689240e-kube-api-access-qt2bb\") pod \"nova-metadata-0\" (UID: 
\"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.973945 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-svc\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.987043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-config-data\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.988622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8deccf1-954a-4b62-bb58-b0221689240e-logs\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.994620 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.994646 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt2bb\" (UniqueName: \"kubernetes.io/projected/c8deccf1-954a-4b62-bb58-b0221689240e-kube-api-access-qt2bb\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.997915 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-rnwql\" (UniqueName: \"kubernetes.io/projected/79c6d349-9c67-41bf-b71c-8a1ce73e765e-kube-api-access-rnwql\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.999258 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-config-data\") pod \"nova-scheduler-0\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " pod="openstack/nova-scheduler-0" Oct 01 13:27:46 crc kubenswrapper[4749]: I1001 13:27:46.999706 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.075918 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-sb\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.076059 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-config\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.076120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-swift-storage-0\") pod 
\"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.076159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-svc\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.076204 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-nb\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.076267 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mshtn\" (UniqueName: \"kubernetes.io/projected/67837f8f-64b7-46b6-868c-8a9abb273f36-kube-api-access-mshtn\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.077460 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-sb\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.078060 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-config\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " 
pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.078738 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-swift-storage-0\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.079024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-svc\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.079036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-nb\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.095568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mshtn\" (UniqueName: \"kubernetes.io/projected/67837f8f-64b7-46b6-868c-8a9abb273f36-kube-api-access-mshtn\") pod \"dnsmasq-dns-599b6fdf6c-xm8tc\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.148822 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.176406 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.189808 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:47 crc kubenswrapper[4749]: W1001 13:27:47.383114 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dcfec60_454f_4182_b53d_280d182dee40.slice/crio-ab9fb7441fe2767d5ddf38208bd44080d6d7f8ec8b780b6d91e2bc584df81bcb WatchSource:0}: Error finding container ab9fb7441fe2767d5ddf38208bd44080d6d7f8ec8b780b6d91e2bc584df81bcb: Status 404 returned error can't find the container with id ab9fb7441fe2767d5ddf38208bd44080d6d7f8ec8b780b6d91e2bc584df81bcb Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.389150 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-tkdsq"] Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.408895 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.559634 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.798194 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5b99w"] Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.800629 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.803622 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.803785 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.846701 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5b99w"] Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.870912 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.900668 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.900952 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-scripts\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.901078 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clk2b\" (UniqueName: \"kubernetes.io/projected/46286372-30ef-489e-8076-a65ad341d010-kube-api-access-clk2b\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " 
pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.901111 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-config-data\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.916147 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:27:47 crc kubenswrapper[4749]: I1001 13:27:47.926961 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-599b6fdf6c-xm8tc"] Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.002926 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clk2b\" (UniqueName: \"kubernetes.io/projected/46286372-30ef-489e-8076-a65ad341d010-kube-api-access-clk2b\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.002982 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-config-data\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.003028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 
13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.003057 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-scripts\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.007974 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-scripts\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.009693 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.009770 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-config-data\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.018019 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clk2b\" (UniqueName: \"kubernetes.io/projected/46286372-30ef-489e-8076-a65ad341d010-kube-api-access-clk2b\") pod \"nova-cell1-conductor-db-sync-5b99w\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 
13:27:48.231015 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.352274 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8deccf1-954a-4b62-bb58-b0221689240e","Type":"ContainerStarted","Data":"c3955b1d8765646f0a91f20ea9980ca8dd8fe7326a77fb6de475b12b449b685a"} Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.358271 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tkdsq" event={"ID":"0dcfec60-454f-4182-b53d-280d182dee40","Type":"ContainerStarted","Data":"369d9fd8645d6a719de992e7103cc6537fde49bf3dcf4f69544d82d1b2978f28"} Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.358315 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tkdsq" event={"ID":"0dcfec60-454f-4182-b53d-280d182dee40","Type":"ContainerStarted","Data":"ab9fb7441fe2767d5ddf38208bd44080d6d7f8ec8b780b6d91e2bc584df81bcb"} Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.362154 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a209604-f49d-4844-a505-c3cb9202fb13","Type":"ContainerStarted","Data":"9ecb5328eb20e397503d960e80734f093f9a85268aaf815bc5c6420f27005b46"} Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.367156 4749 generic.go:334] "Generic (PLEG): container finished" podID="67837f8f-64b7-46b6-868c-8a9abb273f36" containerID="3d9bba667ef9acca67b52baaba8b9de703baf6dc622f09faa22af62d7fb18c73" exitCode=0 Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.367255 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" event={"ID":"67837f8f-64b7-46b6-868c-8a9abb273f36","Type":"ContainerDied","Data":"3d9bba667ef9acca67b52baaba8b9de703baf6dc622f09faa22af62d7fb18c73"} Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 
13:27:48.367320 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" event={"ID":"67837f8f-64b7-46b6-868c-8a9abb273f36","Type":"ContainerStarted","Data":"377c4c29e43f43e62e8dc07e827cf5dcb890618dbf57c449692d4302e9b57661"} Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.369182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05a66324-287f-4459-b601-e79c152c1ba0","Type":"ContainerStarted","Data":"f59ca6910cc1e1541e138444fe2167dfaf50e78680a2716b8d5c3e0a2f33e769"} Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.371305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"79c6d349-9c67-41bf-b71c-8a1ce73e765e","Type":"ContainerStarted","Data":"5cdbc4c03fa7bdcd216e7956fa27da2eb18208913cbf61e27c23c17ce004a0d4"} Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.378369 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-tkdsq" podStartSLOduration=2.378353484 podStartE2EDuration="2.378353484s" podCreationTimestamp="2025-10-01 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:27:48.37686335 +0000 UTC m=+1328.430848239" watchObservedRunningTime="2025-10-01 13:27:48.378353484 +0000 UTC m=+1328.432338383" Oct 01 13:27:48 crc kubenswrapper[4749]: I1001 13:27:48.728780 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5b99w"] Oct 01 13:27:49 crc kubenswrapper[4749]: I1001 13:27:49.388813 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5b99w" event={"ID":"46286372-30ef-489e-8076-a65ad341d010","Type":"ContainerStarted","Data":"f9f9523a2e38d74c886e884ded66fdacecce1bfb6bb065c69d5f4ddf81901111"} Oct 01 13:27:50 crc kubenswrapper[4749]: I1001 
13:27:50.168413 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:27:50 crc kubenswrapper[4749]: I1001 13:27:50.183036 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.459388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"79c6d349-9c67-41bf-b71c-8a1ce73e765e","Type":"ContainerStarted","Data":"723d9046411404939ef327a949756f01e2d9f84315b59d8d17355d1b64f598e9"} Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.461679 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8deccf1-954a-4b62-bb58-b0221689240e","Type":"ContainerStarted","Data":"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614"} Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.461723 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8deccf1-954a-4b62-bb58-b0221689240e","Type":"ContainerStarted","Data":"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410"} Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.461898 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" containerName="nova-metadata-log" containerID="cri-o://805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410" gracePeriod=30 Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.462181 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" containerName="nova-metadata-metadata" containerID="cri-o://7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614" gracePeriod=30 Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.473943 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"8a209604-f49d-4844-a505-c3cb9202fb13","Type":"ContainerStarted","Data":"5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858"} Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.474113 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a209604-f49d-4844-a505-c3cb9202fb13","Type":"ContainerStarted","Data":"269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84"} Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.481701 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" event={"ID":"67837f8f-64b7-46b6-868c-8a9abb273f36","Type":"ContainerStarted","Data":"f7b72765801a8a7084e320c5f82aec38e7861ac383f821b991f8545bdb648673"} Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.482011 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.486288 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.831468719 podStartE2EDuration="6.486266824s" podCreationTimestamp="2025-10-01 13:27:46 +0000 UTC" firstStartedPulling="2025-10-01 13:27:47.851177577 +0000 UTC m=+1327.905162476" lastFinishedPulling="2025-10-01 13:27:51.505975682 +0000 UTC m=+1331.559960581" observedRunningTime="2025-10-01 13:27:52.480590276 +0000 UTC m=+1332.534575175" watchObservedRunningTime="2025-10-01 13:27:52.486266824 +0000 UTC m=+1332.540251723" Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.487167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05a66324-287f-4459-b601-e79c152c1ba0","Type":"ContainerStarted","Data":"a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21"} Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.487209 4749 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="05a66324-287f-4459-b601-e79c152c1ba0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21" gracePeriod=30 Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.489382 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5b99w" event={"ID":"46286372-30ef-489e-8076-a65ad341d010","Type":"ContainerStarted","Data":"bc9e246cd99f270946e29c2281e7926f9b778fd8bcc539e8e67528d2e2d9d7e2"} Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.513116 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.816834506 podStartE2EDuration="6.513090699s" podCreationTimestamp="2025-10-01 13:27:46 +0000 UTC" firstStartedPulling="2025-10-01 13:27:47.837228864 +0000 UTC m=+1327.891213763" lastFinishedPulling="2025-10-01 13:27:51.533485057 +0000 UTC m=+1331.587469956" observedRunningTime="2025-10-01 13:27:52.500024332 +0000 UTC m=+1332.554009231" watchObservedRunningTime="2025-10-01 13:27:52.513090699 +0000 UTC m=+1332.567075598" Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.526903 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.434478597 podStartE2EDuration="6.526881517s" podCreationTimestamp="2025-10-01 13:27:46 +0000 UTC" firstStartedPulling="2025-10-01 13:27:47.416105177 +0000 UTC m=+1327.470090076" lastFinishedPulling="2025-10-01 13:27:51.508508057 +0000 UTC m=+1331.562492996" observedRunningTime="2025-10-01 13:27:52.516270043 +0000 UTC m=+1332.570254952" watchObservedRunningTime="2025-10-01 13:27:52.526881517 +0000 UTC m=+1332.580866416" Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.556284 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" 
podStartSLOduration=6.556256518 podStartE2EDuration="6.556256518s" podCreationTimestamp="2025-10-01 13:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:27:52.554147545 +0000 UTC m=+1332.608132464" watchObservedRunningTime="2025-10-01 13:27:52.556256518 +0000 UTC m=+1332.610241437" Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.564705 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.6806196890000002 podStartE2EDuration="6.564683007s" podCreationTimestamp="2025-10-01 13:27:46 +0000 UTC" firstStartedPulling="2025-10-01 13:27:47.653725427 +0000 UTC m=+1327.707710316" lastFinishedPulling="2025-10-01 13:27:51.537788735 +0000 UTC m=+1331.591773634" observedRunningTime="2025-10-01 13:27:52.540955464 +0000 UTC m=+1332.594940403" watchObservedRunningTime="2025-10-01 13:27:52.564683007 +0000 UTC m=+1332.618667916" Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.575339 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5b99w" podStartSLOduration=5.575316722 podStartE2EDuration="5.575316722s" podCreationTimestamp="2025-10-01 13:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:27:52.569310204 +0000 UTC m=+1332.623295123" watchObservedRunningTime="2025-10-01 13:27:52.575316722 +0000 UTC m=+1332.629301631" Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.891107 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:27:52 crc kubenswrapper[4749]: I1001 13:27:52.891490 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e2054662-5786-4a04-a7c9-16fe32a04610" 
containerName="kube-state-metrics" containerID="cri-o://2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090" gracePeriod=30 Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.181282 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.370803 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-config-data\") pod \"c8deccf1-954a-4b62-bb58-b0221689240e\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.370995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-combined-ca-bundle\") pod \"c8deccf1-954a-4b62-bb58-b0221689240e\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.371028 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8deccf1-954a-4b62-bb58-b0221689240e-logs\") pod \"c8deccf1-954a-4b62-bb58-b0221689240e\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.371089 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt2bb\" (UniqueName: \"kubernetes.io/projected/c8deccf1-954a-4b62-bb58-b0221689240e-kube-api-access-qt2bb\") pod \"c8deccf1-954a-4b62-bb58-b0221689240e\" (UID: \"c8deccf1-954a-4b62-bb58-b0221689240e\") " Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.372500 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8deccf1-954a-4b62-bb58-b0221689240e-logs" (OuterVolumeSpecName: "logs") pod 
"c8deccf1-954a-4b62-bb58-b0221689240e" (UID: "c8deccf1-954a-4b62-bb58-b0221689240e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.396426 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8deccf1-954a-4b62-bb58-b0221689240e-kube-api-access-qt2bb" (OuterVolumeSpecName: "kube-api-access-qt2bb") pod "c8deccf1-954a-4b62-bb58-b0221689240e" (UID: "c8deccf1-954a-4b62-bb58-b0221689240e"). InnerVolumeSpecName "kube-api-access-qt2bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.429202 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8deccf1-954a-4b62-bb58-b0221689240e" (UID: "c8deccf1-954a-4b62-bb58-b0221689240e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.440527 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-config-data" (OuterVolumeSpecName: "config-data") pod "c8deccf1-954a-4b62-bb58-b0221689240e" (UID: "c8deccf1-954a-4b62-bb58-b0221689240e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.473180 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.473224 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8deccf1-954a-4b62-bb58-b0221689240e-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.473234 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt2bb\" (UniqueName: \"kubernetes.io/projected/c8deccf1-954a-4b62-bb58-b0221689240e-kube-api-access-qt2bb\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.473246 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8deccf1-954a-4b62-bb58-b0221689240e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.494989 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.510367 4749 generic.go:334] "Generic (PLEG): container finished" podID="c8deccf1-954a-4b62-bb58-b0221689240e" containerID="7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614" exitCode=0 Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.510393 4749 generic.go:334] "Generic (PLEG): container finished" podID="c8deccf1-954a-4b62-bb58-b0221689240e" containerID="805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410" exitCode=143 Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.510432 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8deccf1-954a-4b62-bb58-b0221689240e","Type":"ContainerDied","Data":"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614"} Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.510456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8deccf1-954a-4b62-bb58-b0221689240e","Type":"ContainerDied","Data":"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410"} Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.510465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c8deccf1-954a-4b62-bb58-b0221689240e","Type":"ContainerDied","Data":"c3955b1d8765646f0a91f20ea9980ca8dd8fe7326a77fb6de475b12b449b685a"} Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.510481 4749 scope.go:117] "RemoveContainer" containerID="7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.510598 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.526631 4749 generic.go:334] "Generic (PLEG): container finished" podID="e2054662-5786-4a04-a7c9-16fe32a04610" containerID="2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090" exitCode=2 Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.527594 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.527723 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2054662-5786-4a04-a7c9-16fe32a04610","Type":"ContainerDied","Data":"2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090"} Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.527745 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e2054662-5786-4a04-a7c9-16fe32a04610","Type":"ContainerDied","Data":"b79164ed521ff520a0259cb9e0e955d30292cd01eeed3003bcd4980acce52cdd"} Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.596163 4749 scope.go:117] "RemoveContainer" containerID="805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.667495 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.681682 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwd6\" (UniqueName: \"kubernetes.io/projected/e2054662-5786-4a04-a7c9-16fe32a04610-kube-api-access-nzwd6\") pod \"e2054662-5786-4a04-a7c9-16fe32a04610\" (UID: \"e2054662-5786-4a04-a7c9-16fe32a04610\") " Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.683712 4749 scope.go:117] "RemoveContainer" containerID="7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614" Oct 01 13:27:53 crc 
kubenswrapper[4749]: I1001 13:27:53.683834 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:53 crc kubenswrapper[4749]: E1001 13:27:53.688151 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614\": container with ID starting with 7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614 not found: ID does not exist" containerID="7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.688202 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614"} err="failed to get container status \"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614\": rpc error: code = NotFound desc = could not find container \"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614\": container with ID starting with 7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614 not found: ID does not exist" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.688268 4749 scope.go:117] "RemoveContainer" containerID="805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410" Oct 01 13:27:53 crc kubenswrapper[4749]: E1001 13:27:53.691736 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410\": container with ID starting with 805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410 not found: ID does not exist" containerID="805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.691762 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410"} err="failed to get container status \"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410\": rpc error: code = NotFound desc = could not find container \"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410\": container with ID starting with 805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410 not found: ID does not exist" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.691779 4749 scope.go:117] "RemoveContainer" containerID="7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.691830 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:53 crc kubenswrapper[4749]: E1001 13:27:53.692255 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2054662-5786-4a04-a7c9-16fe32a04610" containerName="kube-state-metrics" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.692270 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2054662-5786-4a04-a7c9-16fe32a04610" containerName="kube-state-metrics" Oct 01 13:27:53 crc kubenswrapper[4749]: E1001 13:27:53.692297 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" containerName="nova-metadata-log" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.692305 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" containerName="nova-metadata-log" Oct 01 13:27:53 crc kubenswrapper[4749]: E1001 13:27:53.692322 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" containerName="nova-metadata-metadata" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.692329 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" 
containerName="nova-metadata-metadata" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.692551 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" containerName="nova-metadata-metadata" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.692567 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2054662-5786-4a04-a7c9-16fe32a04610" containerName="kube-state-metrics" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.692577 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" containerName="nova-metadata-log" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.693123 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614"} err="failed to get container status \"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614\": rpc error: code = NotFound desc = could not find container \"7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614\": container with ID starting with 7fd850d8a5f56af924ca3628b208a1db58e3fe76acfe05ee3df91c3f1c09f614 not found: ID does not exist" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.693169 4749 scope.go:117] "RemoveContainer" containerID="805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.693681 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.698077 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.699036 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2054662-5786-4a04-a7c9-16fe32a04610-kube-api-access-nzwd6" (OuterVolumeSpecName: "kube-api-access-nzwd6") pod "e2054662-5786-4a04-a7c9-16fe32a04610" (UID: "e2054662-5786-4a04-a7c9-16fe32a04610"). InnerVolumeSpecName "kube-api-access-nzwd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.700602 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410"} err="failed to get container status \"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410\": rpc error: code = NotFound desc = could not find container \"805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410\": container with ID starting with 805e8c58263f8f3c68cea217b2a2154826ed371c4426db63a41ea0be701e9410 not found: ID does not exist" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.700647 4749 scope.go:117] "RemoveContainer" containerID="2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.704807 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.704856 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.768634 4749 scope.go:117] "RemoveContainer" containerID="2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090" Oct 01 13:27:53 crc 
kubenswrapper[4749]: E1001 13:27:53.769556 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090\": container with ID starting with 2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090 not found: ID does not exist" containerID="2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.769594 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090"} err="failed to get container status \"2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090\": rpc error: code = NotFound desc = could not find container \"2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090\": container with ID starting with 2c3f9dd05e887708feca7e1b5f68daf4bbf8ee6219d4c651b386c2a4b6e5b090 not found: ID does not exist" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.783744 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwd6\" (UniqueName: \"kubernetes.io/projected/e2054662-5786-4a04-a7c9-16fe32a04610-kube-api-access-nzwd6\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.859283 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.866902 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.882889 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.884026 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.885611 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-logs\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.885647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.885728 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.885799 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.885811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5bhg\" (UniqueName: \"kubernetes.io/projected/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-kube-api-access-t5bhg\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.885841 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-config-data\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.886169 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.897702 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987422 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987454 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987525 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987685 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5bhg\" (UniqueName: \"kubernetes.io/projected/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-kube-api-access-t5bhg\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987789 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-config-data\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987824 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-logs\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987861 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdgv\" (UniqueName: \"kubernetes.io/projected/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-api-access-zvdgv\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.987897 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.988460 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-logs\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.991097 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:53 crc kubenswrapper[4749]: I1001 13:27:53.999855 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-config-data\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.000766 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.003907 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5bhg\" (UniqueName: \"kubernetes.io/projected/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-kube-api-access-t5bhg\") pod \"nova-metadata-0\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") " pod="openstack/nova-metadata-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.051998 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.089424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.089703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.089841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdgv\" (UniqueName: \"kubernetes.io/projected/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-api-access-zvdgv\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.089970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.093525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " 
pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.094024 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.097805 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b21eff6-e2ad-4c02-9558-0346ff822f46-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.106555 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdgv\" (UniqueName: \"kubernetes.io/projected/4b21eff6-e2ad-4c02-9558-0346ff822f46-kube-api-access-zvdgv\") pod \"kube-state-metrics-0\" (UID: \"4b21eff6-e2ad-4c02-9558-0346ff822f46\") " pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.207271 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.535455 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:54 crc kubenswrapper[4749]: I1001 13:27:54.665026 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.256472 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8deccf1-954a-4b62-bb58-b0221689240e" path="/var/lib/kubelet/pods/c8deccf1-954a-4b62-bb58-b0221689240e/volumes" Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.257257 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2054662-5786-4a04-a7c9-16fe32a04610" path="/var/lib/kubelet/pods/e2054662-5786-4a04-a7c9-16fe32a04610/volumes" Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.559901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0327c5c1-8a89-488e-aaaa-36e4c43fe73e","Type":"ContainerStarted","Data":"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899"} Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.560288 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0327c5c1-8a89-488e-aaaa-36e4c43fe73e","Type":"ContainerStarted","Data":"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33"} Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.560306 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0327c5c1-8a89-488e-aaaa-36e4c43fe73e","Type":"ContainerStarted","Data":"df7bf36cb3054bc9c296e1f285729996d1bc78d72b4db78895536d50821c795a"} Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.563180 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"4b21eff6-e2ad-4c02-9558-0346ff822f46","Type":"ContainerStarted","Data":"67f8f38f3ca8e77c064538f19cc0f6dbc7aeb33988dac40dde837f14da3df862"} Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.563235 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4b21eff6-e2ad-4c02-9558-0346ff822f46","Type":"ContainerStarted","Data":"ab094843447d6f66cf26f08eb7b8daca127cb77ba05c71e615b0b20a770354cc"} Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.563843 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.591008 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.590983574 podStartE2EDuration="2.590983574s" podCreationTimestamp="2025-10-01 13:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:27:55.579094521 +0000 UTC m=+1335.633079430" watchObservedRunningTime="2025-10-01 13:27:55.590983574 +0000 UTC m=+1335.644968483" Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.810467 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.443967288 podStartE2EDuration="2.810435735s" podCreationTimestamp="2025-10-01 13:27:53 +0000 UTC" firstStartedPulling="2025-10-01 13:27:54.677571614 +0000 UTC m=+1334.731556513" lastFinishedPulling="2025-10-01 13:27:55.044040051 +0000 UTC m=+1335.098024960" observedRunningTime="2025-10-01 13:27:55.610524952 +0000 UTC m=+1335.664509861" watchObservedRunningTime="2025-10-01 13:27:55.810435735 +0000 UTC m=+1335.864420664" Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.814135 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 
13:27:55.814548 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="ceilometer-central-agent" containerID="cri-o://5ac823d6a17967ad62e81f6352f956bc3f3ae0ec1b539bf76cd6533923a16b8d" gracePeriod=30 Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.814645 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="sg-core" containerID="cri-o://63038e8b4c7657fa884dc3d46ca1ada1c7c8a35c97bcf88c126efcf6c5f02c57" gracePeriod=30 Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.814783 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="proxy-httpd" containerID="cri-o://2ccd41c2c96f472955ca03754b9527ab834fc368b5d0f02db355abaa8c24fb9e" gracePeriod=30 Oct 01 13:27:55 crc kubenswrapper[4749]: I1001 13:27:55.814840 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="ceilometer-notification-agent" containerID="cri-o://440441178e80869539f16ee0b9e3dc35beeebb8ce05fb21f5cf5248df128a6c3" gracePeriod=30 Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.580688 4749 generic.go:334] "Generic (PLEG): container finished" podID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerID="2ccd41c2c96f472955ca03754b9527ab834fc368b5d0f02db355abaa8c24fb9e" exitCode=0 Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.581026 4749 generic.go:334] "Generic (PLEG): container finished" podID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerID="63038e8b4c7657fa884dc3d46ca1ada1c7c8a35c97bcf88c126efcf6c5f02c57" exitCode=2 Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.581036 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerID="5ac823d6a17967ad62e81f6352f956bc3f3ae0ec1b539bf76cd6533923a16b8d" exitCode=0 Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.580922 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerDied","Data":"2ccd41c2c96f472955ca03754b9527ab834fc368b5d0f02db355abaa8c24fb9e"} Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.581107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerDied","Data":"63038e8b4c7657fa884dc3d46ca1ada1c7c8a35c97bcf88c126efcf6c5f02c57"} Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.581139 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerDied","Data":"5ac823d6a17967ad62e81f6352f956bc3f3ae0ec1b539bf76cd6533923a16b8d"} Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.773689 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.773743 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:27:56 crc kubenswrapper[4749]: I1001 13:27:56.820669 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.177746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.177786 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.193649 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.225185 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.282580 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cccb8cfc9-7lmlc"] Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.282845 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" podUID="0585bc19-be35-4666-8882-9f8332fd362d" containerName="dnsmasq-dns" containerID="cri-o://362dfa8dcb380fc0c25a0fc3ec487416da72f8450c39a2e3b46abbc8f25f9aac" gracePeriod=10 Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.591243 4749 generic.go:334] "Generic (PLEG): container finished" podID="0585bc19-be35-4666-8882-9f8332fd362d" containerID="362dfa8dcb380fc0c25a0fc3ec487416da72f8450c39a2e3b46abbc8f25f9aac" exitCode=0 Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.591471 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" event={"ID":"0585bc19-be35-4666-8882-9f8332fd362d","Type":"ContainerDied","Data":"362dfa8dcb380fc0c25a0fc3ec487416da72f8450c39a2e3b46abbc8f25f9aac"} Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.596021 4749 generic.go:334] "Generic (PLEG): container finished" podID="0dcfec60-454f-4182-b53d-280d182dee40" containerID="369d9fd8645d6a719de992e7103cc6537fde49bf3dcf4f69544d82d1b2978f28" exitCode=0 Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.597117 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tkdsq" event={"ID":"0dcfec60-454f-4182-b53d-280d182dee40","Type":"ContainerDied","Data":"369d9fd8645d6a719de992e7103cc6537fde49bf3dcf4f69544d82d1b2978f28"} Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.661057 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.833705 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.856418 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.856834 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.909175 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp2p8\" (UniqueName: \"kubernetes.io/projected/0585bc19-be35-4666-8882-9f8332fd362d-kube-api-access-qp2p8\") pod \"0585bc19-be35-4666-8882-9f8332fd362d\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.909283 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-config\") pod \"0585bc19-be35-4666-8882-9f8332fd362d\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.909423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-sb\") pod \"0585bc19-be35-4666-8882-9f8332fd362d\" (UID: 
\"0585bc19-be35-4666-8882-9f8332fd362d\") " Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.909455 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-nb\") pod \"0585bc19-be35-4666-8882-9f8332fd362d\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.909478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-svc\") pod \"0585bc19-be35-4666-8882-9f8332fd362d\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.909504 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-swift-storage-0\") pod \"0585bc19-be35-4666-8882-9f8332fd362d\" (UID: \"0585bc19-be35-4666-8882-9f8332fd362d\") " Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.916440 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0585bc19-be35-4666-8882-9f8332fd362d-kube-api-access-qp2p8" (OuterVolumeSpecName: "kube-api-access-qp2p8") pod "0585bc19-be35-4666-8882-9f8332fd362d" (UID: "0585bc19-be35-4666-8882-9f8332fd362d"). InnerVolumeSpecName "kube-api-access-qp2p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.967874 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-config" (OuterVolumeSpecName: "config") pod "0585bc19-be35-4666-8882-9f8332fd362d" (UID: "0585bc19-be35-4666-8882-9f8332fd362d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.973848 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0585bc19-be35-4666-8882-9f8332fd362d" (UID: "0585bc19-be35-4666-8882-9f8332fd362d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.975260 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0585bc19-be35-4666-8882-9f8332fd362d" (UID: "0585bc19-be35-4666-8882-9f8332fd362d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.977585 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0585bc19-be35-4666-8882-9f8332fd362d" (UID: "0585bc19-be35-4666-8882-9f8332fd362d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:57 crc kubenswrapper[4749]: I1001 13:27:57.999747 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0585bc19-be35-4666-8882-9f8332fd362d" (UID: "0585bc19-be35-4666-8882-9f8332fd362d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.012058 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.012092 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.012102 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.012111 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.012120 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp2p8\" (UniqueName: \"kubernetes.io/projected/0585bc19-be35-4666-8882-9f8332fd362d-kube-api-access-qp2p8\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.012129 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0585bc19-be35-4666-8882-9f8332fd362d-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.607898 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.607970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cccb8cfc9-7lmlc" event={"ID":"0585bc19-be35-4666-8882-9f8332fd362d","Type":"ContainerDied","Data":"8f928912266c1f8c468dae2c167462a53981b874c51d27eb6a090efe73a16fd1"} Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.608801 4749 scope.go:117] "RemoveContainer" containerID="362dfa8dcb380fc0c25a0fc3ec487416da72f8450c39a2e3b46abbc8f25f9aac" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.654493 4749 scope.go:117] "RemoveContainer" containerID="1ad06e42f284f5cd11736595c78ac06024aad7c092b91f7f2c6c39b9ecba5933" Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.655389 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cccb8cfc9-7lmlc"] Oct 01 13:27:58 crc kubenswrapper[4749]: I1001 13:27:58.667148 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cccb8cfc9-7lmlc"] Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.047318 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.052134 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.052240 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.143577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-config-data\") pod \"0dcfec60-454f-4182-b53d-280d182dee40\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.143931 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-scripts\") pod \"0dcfec60-454f-4182-b53d-280d182dee40\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.143994 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcfml\" (UniqueName: \"kubernetes.io/projected/0dcfec60-454f-4182-b53d-280d182dee40-kube-api-access-gcfml\") pod \"0dcfec60-454f-4182-b53d-280d182dee40\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.144043 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-combined-ca-bundle\") pod \"0dcfec60-454f-4182-b53d-280d182dee40\" (UID: \"0dcfec60-454f-4182-b53d-280d182dee40\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.165427 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0dcfec60-454f-4182-b53d-280d182dee40-kube-api-access-gcfml" (OuterVolumeSpecName: "kube-api-access-gcfml") pod "0dcfec60-454f-4182-b53d-280d182dee40" (UID: "0dcfec60-454f-4182-b53d-280d182dee40"). InnerVolumeSpecName "kube-api-access-gcfml". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.171357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-scripts" (OuterVolumeSpecName: "scripts") pod "0dcfec60-454f-4182-b53d-280d182dee40" (UID: "0dcfec60-454f-4182-b53d-280d182dee40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.182462 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-config-data" (OuterVolumeSpecName: "config-data") pod "0dcfec60-454f-4182-b53d-280d182dee40" (UID: "0dcfec60-454f-4182-b53d-280d182dee40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.187950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dcfec60-454f-4182-b53d-280d182dee40" (UID: "0dcfec60-454f-4182-b53d-280d182dee40"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.257997 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.258049 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.258067 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcfml\" (UniqueName: \"kubernetes.io/projected/0dcfec60-454f-4182-b53d-280d182dee40-kube-api-access-gcfml\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.258091 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dcfec60-454f-4182-b53d-280d182dee40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.277505 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0585bc19-be35-4666-8882-9f8332fd362d" path="/var/lib/kubelet/pods/0585bc19-be35-4666-8882-9f8332fd362d/volumes" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.620197 4749 generic.go:334] "Generic (PLEG): container finished" podID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerID="440441178e80869539f16ee0b9e3dc35beeebb8ce05fb21f5cf5248df128a6c3" exitCode=0 Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.620272 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerDied","Data":"440441178e80869539f16ee0b9e3dc35beeebb8ce05fb21f5cf5248df128a6c3"} Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.634143 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-tkdsq" event={"ID":"0dcfec60-454f-4182-b53d-280d182dee40","Type":"ContainerDied","Data":"ab9fb7441fe2767d5ddf38208bd44080d6d7f8ec8b780b6d91e2bc584df81bcb"} Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.634188 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab9fb7441fe2767d5ddf38208bd44080d6d7f8ec8b780b6d91e2bc584df81bcb" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.634186 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-tkdsq" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.778679 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.810721 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.811060 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="79c6d349-9c67-41bf-b71c-8a1ce73e765e" containerName="nova-scheduler-scheduler" containerID="cri-o://723d9046411404939ef327a949756f01e2d9f84315b59d8d17355d1b64f598e9" gracePeriod=30 Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.823279 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.823492 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-log" containerID="cri-o://269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84" gracePeriod=30 Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.823927 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-api" containerID="cri-o://5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858" gracePeriod=30 Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.868324 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fxzj\" (UniqueName: \"kubernetes.io/projected/729e9ae9-3bf2-4e33-95e4-33f96da93660-kube-api-access-5fxzj\") pod \"729e9ae9-3bf2-4e33-95e4-33f96da93660\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.868456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-combined-ca-bundle\") pod \"729e9ae9-3bf2-4e33-95e4-33f96da93660\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.868494 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-config-data\") pod \"729e9ae9-3bf2-4e33-95e4-33f96da93660\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.868584 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-run-httpd\") pod \"729e9ae9-3bf2-4e33-95e4-33f96da93660\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.868611 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-sg-core-conf-yaml\") pod \"729e9ae9-3bf2-4e33-95e4-33f96da93660\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 
13:27:59.868669 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-scripts\") pod \"729e9ae9-3bf2-4e33-95e4-33f96da93660\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.868715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-log-httpd\") pod \"729e9ae9-3bf2-4e33-95e4-33f96da93660\" (UID: \"729e9ae9-3bf2-4e33-95e4-33f96da93660\") " Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.869779 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "729e9ae9-3bf2-4e33-95e4-33f96da93660" (UID: "729e9ae9-3bf2-4e33-95e4-33f96da93660"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.870566 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "729e9ae9-3bf2-4e33-95e4-33f96da93660" (UID: "729e9ae9-3bf2-4e33-95e4-33f96da93660"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.878086 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/729e9ae9-3bf2-4e33-95e4-33f96da93660-kube-api-access-5fxzj" (OuterVolumeSpecName: "kube-api-access-5fxzj") pod "729e9ae9-3bf2-4e33-95e4-33f96da93660" (UID: "729e9ae9-3bf2-4e33-95e4-33f96da93660"). InnerVolumeSpecName "kube-api-access-5fxzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.897810 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-scripts" (OuterVolumeSpecName: "scripts") pod "729e9ae9-3bf2-4e33-95e4-33f96da93660" (UID: "729e9ae9-3bf2-4e33-95e4-33f96da93660"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.908902 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.955526 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "729e9ae9-3bf2-4e33-95e4-33f96da93660" (UID: "729e9ae9-3bf2-4e33-95e4-33f96da93660"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.974492 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fxzj\" (UniqueName: \"kubernetes.io/projected/729e9ae9-3bf2-4e33-95e4-33f96da93660-kube-api-access-5fxzj\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.974527 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.974538 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.974545 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.974554 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/729e9ae9-3bf2-4e33-95e4-33f96da93660-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 13:27:59 crc kubenswrapper[4749]: I1001 13:27:59.989482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "729e9ae9-3bf2-4e33-95e4-33f96da93660" (UID: "729e9ae9-3bf2-4e33-95e4-33f96da93660"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.022297 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-config-data" (OuterVolumeSpecName: "config-data") pod "729e9ae9-3bf2-4e33-95e4-33f96da93660" (UID: "729e9ae9-3bf2-4e33-95e4-33f96da93660"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.076294 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.076767 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/729e9ae9-3bf2-4e33-95e4-33f96da93660-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.658700 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"729e9ae9-3bf2-4e33-95e4-33f96da93660","Type":"ContainerDied","Data":"2a1126e0e98ad9c97d4ded14134109d41feeed7d89fab8b646a827a3c0fc8b18"} Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.658886 4749 scope.go:117] "RemoveContainer" containerID="2ccd41c2c96f472955ca03754b9527ab834fc368b5d0f02db355abaa8c24fb9e" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.659257 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.673968 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a209604-f49d-4844-a505-c3cb9202fb13" containerID="269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84" exitCode=143 Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.674125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a209604-f49d-4844-a505-c3cb9202fb13","Type":"ContainerDied","Data":"269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84"} Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.674246 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerName="nova-metadata-metadata" containerID="cri-o://96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899" gracePeriod=30 Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.674186 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerName="nova-metadata-log" containerID="cri-o://59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33" gracePeriod=30 Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.704549 4749 scope.go:117] "RemoveContainer" containerID="63038e8b4c7657fa884dc3d46ca1ada1c7c8a35c97bcf88c126efcf6c5f02c57" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.717747 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.729944 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.751925 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:28:00 crc kubenswrapper[4749]: E1001 13:28:00.752306 
4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0585bc19-be35-4666-8882-9f8332fd362d" containerName="dnsmasq-dns" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752318 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0585bc19-be35-4666-8882-9f8332fd362d" containerName="dnsmasq-dns" Oct 01 13:28:00 crc kubenswrapper[4749]: E1001 13:28:00.752326 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="proxy-httpd" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752332 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="proxy-httpd" Oct 01 13:28:00 crc kubenswrapper[4749]: E1001 13:28:00.752363 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="ceilometer-central-agent" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752369 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="ceilometer-central-agent" Oct 01 13:28:00 crc kubenswrapper[4749]: E1001 13:28:00.752382 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0585bc19-be35-4666-8882-9f8332fd362d" containerName="init" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752389 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0585bc19-be35-4666-8882-9f8332fd362d" containerName="init" Oct 01 13:28:00 crc kubenswrapper[4749]: E1001 13:28:00.752395 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="sg-core" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752400 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="sg-core" Oct 01 13:28:00 crc kubenswrapper[4749]: E1001 13:28:00.752416 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="ceilometer-notification-agent" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752423 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="ceilometer-notification-agent" Oct 01 13:28:00 crc kubenswrapper[4749]: E1001 13:28:00.752433 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcfec60-454f-4182-b53d-280d182dee40" containerName="nova-manage" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752438 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcfec60-454f-4182-b53d-280d182dee40" containerName="nova-manage" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752620 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="ceilometer-central-agent" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752633 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dcfec60-454f-4182-b53d-280d182dee40" containerName="nova-manage" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752647 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="ceilometer-notification-agent" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752661 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0585bc19-be35-4666-8882-9f8332fd362d" containerName="dnsmasq-dns" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752676 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="proxy-httpd" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.752686 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" containerName="sg-core" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.756697 4749 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.760280 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.760567 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.765716 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.770000 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.774401 4749 scope.go:117] "RemoveContainer" containerID="440441178e80869539f16ee0b9e3dc35beeebb8ce05fb21f5cf5248df128a6c3" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.792641 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp7c2\" (UniqueName: \"kubernetes.io/projected/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-kube-api-access-lp7c2\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.792729 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-log-httpd\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.792898 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-scripts\") pod \"ceilometer-0\" (UID: 
\"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.792950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-run-httpd\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.793036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.793110 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-config-data\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.793159 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.793309 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 
13:28:00.851818 4749 scope.go:117] "RemoveContainer" containerID="5ac823d6a17967ad62e81f6352f956bc3f3ae0ec1b539bf76cd6533923a16b8d" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.896280 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.896380 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.896426 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp7c2\" (UniqueName: \"kubernetes.io/projected/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-kube-api-access-lp7c2\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.896459 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-log-httpd\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.896537 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-scripts\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.896563 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-run-httpd\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.896608 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.896641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-config-data\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.898300 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-log-httpd\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.900423 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-run-httpd\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.900866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-config-data\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " 
pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.901281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.913873 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.915628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-scripts\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.916835 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp7c2\" (UniqueName: \"kubernetes.io/projected/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-kube-api-access-lp7c2\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:00 crc kubenswrapper[4749]: I1001 13:28:00.918004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") " pod="openstack/ceilometer-0" Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.084163 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.189729 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.251251 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="729e9ae9-3bf2-4e33-95e4-33f96da93660" path="/var/lib/kubelet/pods/729e9ae9-3bf2-4e33-95e4-33f96da93660/volumes"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.303022 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5bhg\" (UniqueName: \"kubernetes.io/projected/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-kube-api-access-t5bhg\") pod \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.303171 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-config-data\") pod \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.303239 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-logs\") pod \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.303318 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-combined-ca-bundle\") pod \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.303436 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-nova-metadata-tls-certs\") pod \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\" (UID: \"0327c5c1-8a89-488e-aaaa-36e4c43fe73e\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.305024 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-logs" (OuterVolumeSpecName: "logs") pod "0327c5c1-8a89-488e-aaaa-36e4c43fe73e" (UID: "0327c5c1-8a89-488e-aaaa-36e4c43fe73e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.309867 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-kube-api-access-t5bhg" (OuterVolumeSpecName: "kube-api-access-t5bhg") pod "0327c5c1-8a89-488e-aaaa-36e4c43fe73e" (UID: "0327c5c1-8a89-488e-aaaa-36e4c43fe73e"). InnerVolumeSpecName "kube-api-access-t5bhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.344334 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-config-data" (OuterVolumeSpecName: "config-data") pod "0327c5c1-8a89-488e-aaaa-36e4c43fe73e" (UID: "0327c5c1-8a89-488e-aaaa-36e4c43fe73e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.347385 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0327c5c1-8a89-488e-aaaa-36e4c43fe73e" (UID: "0327c5c1-8a89-488e-aaaa-36e4c43fe73e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.361716 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0327c5c1-8a89-488e-aaaa-36e4c43fe73e" (UID: "0327c5c1-8a89-488e-aaaa-36e4c43fe73e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.381984 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.408029 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5bhg\" (UniqueName: \"kubernetes.io/projected/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-kube-api-access-t5bhg\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.408085 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.408100 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.408110 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.408121 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0327c5c1-8a89-488e-aaaa-36e4c43fe73e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.509537 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a209604-f49d-4844-a505-c3cb9202fb13-logs\") pod \"8a209604-f49d-4844-a505-c3cb9202fb13\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.509588 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2khqk\" (UniqueName: \"kubernetes.io/projected/8a209604-f49d-4844-a505-c3cb9202fb13-kube-api-access-2khqk\") pod \"8a209604-f49d-4844-a505-c3cb9202fb13\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.509682 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-combined-ca-bundle\") pod \"8a209604-f49d-4844-a505-c3cb9202fb13\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.509752 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-config-data\") pod \"8a209604-f49d-4844-a505-c3cb9202fb13\" (UID: \"8a209604-f49d-4844-a505-c3cb9202fb13\") "
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.509977 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a209604-f49d-4844-a505-c3cb9202fb13-logs" (OuterVolumeSpecName: "logs") pod "8a209604-f49d-4844-a505-c3cb9202fb13" (UID: "8a209604-f49d-4844-a505-c3cb9202fb13"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.510232 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a209604-f49d-4844-a505-c3cb9202fb13-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.517475 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a209604-f49d-4844-a505-c3cb9202fb13-kube-api-access-2khqk" (OuterVolumeSpecName: "kube-api-access-2khqk") pod "8a209604-f49d-4844-a505-c3cb9202fb13" (UID: "8a209604-f49d-4844-a505-c3cb9202fb13"). InnerVolumeSpecName "kube-api-access-2khqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.538239 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a209604-f49d-4844-a505-c3cb9202fb13" (UID: "8a209604-f49d-4844-a505-c3cb9202fb13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.538868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-config-data" (OuterVolumeSpecName: "config-data") pod "8a209604-f49d-4844-a505-c3cb9202fb13" (UID: "8a209604-f49d-4844-a505-c3cb9202fb13"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.590827 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.611611 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2khqk\" (UniqueName: \"kubernetes.io/projected/8a209604-f49d-4844-a505-c3cb9202fb13-kube-api-access-2khqk\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.611841 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.611850 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a209604-f49d-4844-a505-c3cb9202fb13-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.685591 4749 generic.go:334] "Generic (PLEG): container finished" podID="8a209604-f49d-4844-a505-c3cb9202fb13" containerID="5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858" exitCode=0
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.685703 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.685722 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a209604-f49d-4844-a505-c3cb9202fb13","Type":"ContainerDied","Data":"5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858"}
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.685803 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8a209604-f49d-4844-a505-c3cb9202fb13","Type":"ContainerDied","Data":"9ecb5328eb20e397503d960e80734f093f9a85268aaf815bc5c6420f27005b46"}
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.685834 4749 scope.go:117] "RemoveContainer" containerID="5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.689392 4749 generic.go:334] "Generic (PLEG): container finished" podID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerID="96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899" exitCode=0
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.689414 4749 generic.go:334] "Generic (PLEG): container finished" podID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerID="59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33" exitCode=143
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.689430 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0327c5c1-8a89-488e-aaaa-36e4c43fe73e","Type":"ContainerDied","Data":"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899"}
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.689465 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0327c5c1-8a89-488e-aaaa-36e4c43fe73e","Type":"ContainerDied","Data":"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33"}
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.689478 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0327c5c1-8a89-488e-aaaa-36e4c43fe73e","Type":"ContainerDied","Data":"df7bf36cb3054bc9c296e1f285729996d1bc78d72b4db78895536d50821c795a"}
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.689478 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.696732 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerStarted","Data":"3926e9129045ad063d84b48a8dd558bef04c2fe672b137f0db3f5dbe66e48bd2"}
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.699723 4749 generic.go:334] "Generic (PLEG): container finished" podID="46286372-30ef-489e-8076-a65ad341d010" containerID="bc9e246cd99f270946e29c2281e7926f9b778fd8bcc539e8e67528d2e2d9d7e2" exitCode=0
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.699791 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5b99w" event={"ID":"46286372-30ef-489e-8076-a65ad341d010","Type":"ContainerDied","Data":"bc9e246cd99f270946e29c2281e7926f9b778fd8bcc539e8e67528d2e2d9d7e2"}
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.732299 4749 scope.go:117] "RemoveContainer" containerID="269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.787401 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.792506 4749 scope.go:117] "RemoveContainer" containerID="5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858"
Oct 01 13:28:01 crc kubenswrapper[4749]: E1001 13:28:01.793551 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858\": container with ID starting with 5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858 not found: ID does not exist" containerID="5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.793639 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858"} err="failed to get container status \"5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858\": rpc error: code = NotFound desc = could not find container \"5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858\": container with ID starting with 5f471db9dc203ba862f5820173b813c9ff35670bda98d21c6d26cee14b6c5858 not found: ID does not exist"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.793674 4749 scope.go:117] "RemoveContainer" containerID="269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84"
Oct 01 13:28:01 crc kubenswrapper[4749]: E1001 13:28:01.794237 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84\": container with ID starting with 269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84 not found: ID does not exist" containerID="269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.794282 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84"} err="failed to get container status \"269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84\": rpc error: code = NotFound desc = could not find container \"269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84\": container with ID starting with 269b7154b62119f240828d3a7a522760979fe33131679cb14a13ad99d61adb84 not found: ID does not exist"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.794309 4749 scope.go:117] "RemoveContainer" containerID="96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.809505 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.819662 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.829200 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.833308 4749 scope.go:117] "RemoveContainer" containerID="59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.841372 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: E1001 13:28:01.841805 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerName="nova-metadata-log"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.841821 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerName="nova-metadata-log"
Oct 01 13:28:01 crc kubenswrapper[4749]: E1001 13:28:01.841833 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerName="nova-metadata-metadata"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.841839 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerName="nova-metadata-metadata"
Oct 01 13:28:01 crc kubenswrapper[4749]: E1001 13:28:01.841868 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-log"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.841874 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-log"
Oct 01 13:28:01 crc kubenswrapper[4749]: E1001 13:28:01.841885 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-api"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.841891 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-api"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.842066 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-api"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.842083 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" containerName="nova-api-log"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.842093 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerName="nova-metadata-log"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.842107 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" containerName="nova-metadata-metadata"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.843275 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.851018 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.847056 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.864372 4749 scope.go:117] "RemoveContainer" containerID="96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899"
Oct 01 13:28:01 crc kubenswrapper[4749]: E1001 13:28:01.865696 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899\": container with ID starting with 96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899 not found: ID does not exist" containerID="96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.865738 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899"} err="failed to get container status \"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899\": rpc error: code = NotFound desc = could not find container \"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899\": container with ID starting with 96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899 not found: ID does not exist"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.865763 4749 scope.go:117] "RemoveContainer" containerID="59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33"
Oct 01 13:28:01 crc kubenswrapper[4749]: E1001 13:28:01.866094 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33\": container with ID starting with 59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33 not found: ID does not exist" containerID="59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.866123 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33"} err="failed to get container status \"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33\": rpc error: code = NotFound desc = could not find container \"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33\": container with ID starting with 59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33 not found: ID does not exist"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.866146 4749 scope.go:117] "RemoveContainer" containerID="96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.866472 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899"} err="failed to get container status \"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899\": rpc error: code = NotFound desc = could not find container \"96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899\": container with ID starting with 96ede8645beba19bdf962109636cbf7e29cdace5a750e8f1374055718f822899 not found: ID does not exist"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.866494 4749 scope.go:117] "RemoveContainer" containerID="59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.866975 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33"} err="failed to get container status \"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33\": rpc error: code = NotFound desc = could not find container \"59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33\": container with ID starting with 59633363b04d2a1a88517d97bcec7cabfe5b36b908e815be6331d683ec8b8d33 not found: ID does not exist"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.868877 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.870788 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.873833 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.873954 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.882946 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.916507 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rzdg\" (UniqueName: \"kubernetes.io/projected/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-kube-api-access-4rzdg\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.916766 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-config-data\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.916854 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.916970 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-config-data\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.917045 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-logs\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.917163 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzdh7\" (UniqueName: \"kubernetes.io/projected/7762644e-6f15-46ba-82dc-de5a50a63f5b-kube-api-access-lzdh7\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.917258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.917403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7762644e-6f15-46ba-82dc-de5a50a63f5b-logs\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:01 crc kubenswrapper[4749]: I1001 13:28:01.917491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.018949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-config-data\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019137 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-logs\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019272 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzdh7\" (UniqueName: \"kubernetes.io/projected/7762644e-6f15-46ba-82dc-de5a50a63f5b-kube-api-access-lzdh7\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019505 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7762644e-6f15-46ba-82dc-de5a50a63f5b-logs\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019583 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019641 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-logs\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019777 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rzdg\" (UniqueName: \"kubernetes.io/projected/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-kube-api-access-4rzdg\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019908 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-config-data\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.020034 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.019901 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7762644e-6f15-46ba-82dc-de5a50a63f5b-logs\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.025394 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-config-data\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.027160 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-config-data\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.029994 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.031642 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.031709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.034569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzdh7\" (UniqueName: \"kubernetes.io/projected/7762644e-6f15-46ba-82dc-de5a50a63f5b-kube-api-access-lzdh7\") pod \"nova-metadata-0\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.037811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rzdg\" (UniqueName: \"kubernetes.io/projected/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-kube-api-access-4rzdg\") pod \"nova-api-0\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") " pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.106713 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.107089 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.107137 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.107957 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f591f132880451f6e2a795c1ad995a4e9513c1a6eef56b3898e6a9f77eb8baef"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.108023 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://f591f132880451f6e2a795c1ad995a4e9513c1a6eef56b3898e6a9f77eb8baef" gracePeriod=600
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.166599 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:28:02 crc kubenswrapper[4749]: E1001 13:28:02.180450 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="723d9046411404939ef327a949756f01e2d9f84315b59d8d17355d1b64f598e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 01 13:28:02 crc kubenswrapper[4749]: E1001 13:28:02.181921 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="723d9046411404939ef327a949756f01e2d9f84315b59d8d17355d1b64f598e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 01 13:28:02 crc kubenswrapper[4749]: E1001 13:28:02.183437 4749 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="723d9046411404939ef327a949756f01e2d9f84315b59d8d17355d1b64f598e9" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 01 13:28:02 crc kubenswrapper[4749]: E1001 13:28:02.183474 4749 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="79c6d349-9c67-41bf-b71c-8a1ce73e765e" containerName="nova-scheduler-scheduler"
Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.187966 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.688110 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.722207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerStarted","Data":"7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1"} Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.722430 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerStarted","Data":"8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6"} Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.724583 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="f591f132880451f6e2a795c1ad995a4e9513c1a6eef56b3898e6a9f77eb8baef" exitCode=0 Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.724655 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"f591f132880451f6e2a795c1ad995a4e9513c1a6eef56b3898e6a9f77eb8baef"} Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.724696 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56"} Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.724722 4749 scope.go:117] "RemoveContainer" containerID="910cd885f70644be5e766281d9b0c0085bea0ad6a3102c2969a829e2725fb191" Oct 01 13:28:02 crc kubenswrapper[4749]: I1001 13:28:02.794694 4749 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:28:02 crc kubenswrapper[4749]: W1001 13:28:02.798166 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7762644e_6f15_46ba_82dc_de5a50a63f5b.slice/crio-fe4d93741d735a3dcaf44cc01df5dc05bae78ba65c01f0fe30f04c59265fb5eb WatchSource:0}: Error finding container fe4d93741d735a3dcaf44cc01df5dc05bae78ba65c01f0fe30f04c59265fb5eb: Status 404 returned error can't find the container with id fe4d93741d735a3dcaf44cc01df5dc05bae78ba65c01f0fe30f04c59265fb5eb Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.039707 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.159592 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clk2b\" (UniqueName: \"kubernetes.io/projected/46286372-30ef-489e-8076-a65ad341d010-kube-api-access-clk2b\") pod \"46286372-30ef-489e-8076-a65ad341d010\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.159703 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-scripts\") pod \"46286372-30ef-489e-8076-a65ad341d010\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.159838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-config-data\") pod \"46286372-30ef-489e-8076-a65ad341d010\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.159872 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-combined-ca-bundle\") pod \"46286372-30ef-489e-8076-a65ad341d010\" (UID: \"46286372-30ef-489e-8076-a65ad341d010\") " Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.165188 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-scripts" (OuterVolumeSpecName: "scripts") pod "46286372-30ef-489e-8076-a65ad341d010" (UID: "46286372-30ef-489e-8076-a65ad341d010"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.173526 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46286372-30ef-489e-8076-a65ad341d010-kube-api-access-clk2b" (OuterVolumeSpecName: "kube-api-access-clk2b") pod "46286372-30ef-489e-8076-a65ad341d010" (UID: "46286372-30ef-489e-8076-a65ad341d010"). InnerVolumeSpecName "kube-api-access-clk2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.203960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46286372-30ef-489e-8076-a65ad341d010" (UID: "46286372-30ef-489e-8076-a65ad341d010"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.210314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-config-data" (OuterVolumeSpecName: "config-data") pod "46286372-30ef-489e-8076-a65ad341d010" (UID: "46286372-30ef-489e-8076-a65ad341d010"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.242467 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0327c5c1-8a89-488e-aaaa-36e4c43fe73e" path="/var/lib/kubelet/pods/0327c5c1-8a89-488e-aaaa-36e4c43fe73e/volumes" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.243704 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a209604-f49d-4844-a505-c3cb9202fb13" path="/var/lib/kubelet/pods/8a209604-f49d-4844-a505-c3cb9202fb13/volumes" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.261601 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.261630 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.261641 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46286372-30ef-489e-8076-a65ad341d010-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.261652 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clk2b\" (UniqueName: \"kubernetes.io/projected/46286372-30ef-489e-8076-a65ad341d010-kube-api-access-clk2b\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.734189 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerStarted","Data":"a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155"} Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.736087 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7762644e-6f15-46ba-82dc-de5a50a63f5b","Type":"ContainerStarted","Data":"1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950"} Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.736136 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7762644e-6f15-46ba-82dc-de5a50a63f5b","Type":"ContainerStarted","Data":"32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c"} Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.736151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7762644e-6f15-46ba-82dc-de5a50a63f5b","Type":"ContainerStarted","Data":"fe4d93741d735a3dcaf44cc01df5dc05bae78ba65c01f0fe30f04c59265fb5eb"} Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.737957 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5b99w" event={"ID":"46286372-30ef-489e-8076-a65ad341d010","Type":"ContainerDied","Data":"f9f9523a2e38d74c886e884ded66fdacecce1bfb6bb065c69d5f4ddf81901111"} Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.737988 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5b99w" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.737990 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9f9523a2e38d74c886e884ded66fdacecce1bfb6bb065c69d5f4ddf81901111" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.745141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68","Type":"ContainerStarted","Data":"e698349bb482e78848b0954f08cf804bb2012efbd15bb9f4663316a7a52451b9"} Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.745188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68","Type":"ContainerStarted","Data":"dff2cb79a21e89a17daf55ea1a475794c553ba6165c3ce83afce82e6a46be43e"} Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.745202 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68","Type":"ContainerStarted","Data":"fcb07ecc40026c66e54aba4761661d416ce8bb46fc2530a07d3029092d6d5e6b"} Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.763331 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.763315521 podStartE2EDuration="2.763315521s" podCreationTimestamp="2025-10-01 13:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:03.762463456 +0000 UTC m=+1343.816448365" watchObservedRunningTime="2025-10-01 13:28:03.763315521 +0000 UTC m=+1343.817300420" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.795710 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 13:28:03 crc kubenswrapper[4749]: E1001 13:28:03.796129 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="46286372-30ef-489e-8076-a65ad341d010" containerName="nova-cell1-conductor-db-sync" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.796146 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="46286372-30ef-489e-8076-a65ad341d010" containerName="nova-cell1-conductor-db-sync" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.796365 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="46286372-30ef-489e-8076-a65ad341d010" containerName="nova-cell1-conductor-db-sync" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.797067 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.802688 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.803512 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.80349534 podStartE2EDuration="2.80349534s" podCreationTimestamp="2025-10-01 13:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:03.787488628 +0000 UTC m=+1343.841473547" watchObservedRunningTime="2025-10-01 13:28:03.80349534 +0000 UTC m=+1343.857480259" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.860281 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.875393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9d22f-227f-4f5a-8c9c-fc50845af518-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " 
pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.875449 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9d22f-227f-4f5a-8c9c-fc50845af518-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.875533 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bgtp\" (UniqueName: \"kubernetes.io/projected/2fe9d22f-227f-4f5a-8c9c-fc50845af518-kube-api-access-4bgtp\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.977758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bgtp\" (UniqueName: \"kubernetes.io/projected/2fe9d22f-227f-4f5a-8c9c-fc50845af518-kube-api-access-4bgtp\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.977901 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9d22f-227f-4f5a-8c9c-fc50845af518-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.977925 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9d22f-227f-4f5a-8c9c-fc50845af518-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc 
kubenswrapper[4749]: I1001 13:28:03.982770 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fe9d22f-227f-4f5a-8c9c-fc50845af518-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.982838 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fe9d22f-227f-4f5a-8c9c-fc50845af518-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:03 crc kubenswrapper[4749]: I1001 13:28:03.995109 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bgtp\" (UniqueName: \"kubernetes.io/projected/2fe9d22f-227f-4f5a-8c9c-fc50845af518-kube-api-access-4bgtp\") pod \"nova-cell1-conductor-0\" (UID: \"2fe9d22f-227f-4f5a-8c9c-fc50845af518\") " pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.112126 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.222975 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.681875 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 13:28:04 crc kubenswrapper[4749]: W1001 13:28:04.696061 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fe9d22f_227f_4f5a_8c9c_fc50845af518.slice/crio-91b94acf5fe211f6929360e9826e77e6c22b1667073eb1fe8436f79df4b98d15 WatchSource:0}: Error finding container 91b94acf5fe211f6929360e9826e77e6c22b1667073eb1fe8436f79df4b98d15: Status 404 returned error can't find the container with id 91b94acf5fe211f6929360e9826e77e6c22b1667073eb1fe8436f79df4b98d15 Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.764138 4749 generic.go:334] "Generic (PLEG): container finished" podID="79c6d349-9c67-41bf-b71c-8a1ce73e765e" containerID="723d9046411404939ef327a949756f01e2d9f84315b59d8d17355d1b64f598e9" exitCode=0 Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.764184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"79c6d349-9c67-41bf-b71c-8a1ce73e765e","Type":"ContainerDied","Data":"723d9046411404939ef327a949756f01e2d9f84315b59d8d17355d1b64f598e9"} Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.766633 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2fe9d22f-227f-4f5a-8c9c-fc50845af518","Type":"ContainerStarted","Data":"91b94acf5fe211f6929360e9826e77e6c22b1667073eb1fe8436f79df4b98d15"} Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.769298 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerStarted","Data":"8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f"} Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.857120 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:28:04 crc kubenswrapper[4749]: I1001 13:28:04.871736 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.395448903 podStartE2EDuration="4.871721905s" podCreationTimestamp="2025-10-01 13:28:00 +0000 UTC" firstStartedPulling="2025-10-01 13:28:01.601469249 +0000 UTC m=+1341.655454148" lastFinishedPulling="2025-10-01 13:28:04.077742241 +0000 UTC m=+1344.131727150" observedRunningTime="2025-10-01 13:28:04.789767151 +0000 UTC m=+1344.843752050" watchObservedRunningTime="2025-10-01 13:28:04.871721905 +0000 UTC m=+1344.925706804" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.001288 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-combined-ca-bundle\") pod \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.001456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-config-data\") pod \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\" (UID: \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.001508 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnwql\" (UniqueName: \"kubernetes.io/projected/79c6d349-9c67-41bf-b71c-8a1ce73e765e-kube-api-access-rnwql\") pod \"79c6d349-9c67-41bf-b71c-8a1ce73e765e\" (UID: 
\"79c6d349-9c67-41bf-b71c-8a1ce73e765e\") " Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.006530 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c6d349-9c67-41bf-b71c-8a1ce73e765e-kube-api-access-rnwql" (OuterVolumeSpecName: "kube-api-access-rnwql") pod "79c6d349-9c67-41bf-b71c-8a1ce73e765e" (UID: "79c6d349-9c67-41bf-b71c-8a1ce73e765e"). InnerVolumeSpecName "kube-api-access-rnwql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.032626 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-config-data" (OuterVolumeSpecName: "config-data") pod "79c6d349-9c67-41bf-b71c-8a1ce73e765e" (UID: "79c6d349-9c67-41bf-b71c-8a1ce73e765e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.033486 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79c6d349-9c67-41bf-b71c-8a1ce73e765e" (UID: "79c6d349-9c67-41bf-b71c-8a1ce73e765e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.103892 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.103945 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79c6d349-9c67-41bf-b71c-8a1ce73e765e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.103959 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnwql\" (UniqueName: \"kubernetes.io/projected/79c6d349-9c67-41bf-b71c-8a1ce73e765e-kube-api-access-rnwql\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.794179 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2fe9d22f-227f-4f5a-8c9c-fc50845af518","Type":"ContainerStarted","Data":"324c8fce4c90aed2da0082056b71c96898fb76a23445ab2a1855008477db344f"} Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.795892 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.802824 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.802818 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"79c6d349-9c67-41bf-b71c-8a1ce73e765e","Type":"ContainerDied","Data":"5cdbc4c03fa7bdcd216e7956fa27da2eb18208913cbf61e27c23c17ce004a0d4"} Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.802932 4749 scope.go:117] "RemoveContainer" containerID="723d9046411404939ef327a949756f01e2d9f84315b59d8d17355d1b64f598e9" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.803293 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.822777 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.822757619 podStartE2EDuration="2.822757619s" podCreationTimestamp="2025-10-01 13:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:05.819063492 +0000 UTC m=+1345.873048411" watchObservedRunningTime="2025-10-01 13:28:05.822757619 +0000 UTC m=+1345.876742528" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.891903 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.901718 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.912012 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:05 crc kubenswrapper[4749]: E1001 13:28:05.912594 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c6d349-9c67-41bf-b71c-8a1ce73e765e" containerName="nova-scheduler-scheduler" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.912617 4749 
state_mem.go:107] "Deleted CPUSet assignment" podUID="79c6d349-9c67-41bf-b71c-8a1ce73e765e" containerName="nova-scheduler-scheduler" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.912900 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c6d349-9c67-41bf-b71c-8a1ce73e765e" containerName="nova-scheduler-scheduler" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.913687 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.916211 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 13:28:05 crc kubenswrapper[4749]: I1001 13:28:05.921716 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.020403 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-config-data\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.020495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.020793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbmls\" (UniqueName: \"kubernetes.io/projected/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-kube-api-access-cbmls\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " 
pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.122625 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbmls\" (UniqueName: \"kubernetes.io/projected/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-kube-api-access-cbmls\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.122707 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-config-data\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.122758 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.128731 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.130126 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-config-data\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.147981 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cbmls\" (UniqueName: \"kubernetes.io/projected/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-kube-api-access-cbmls\") pod \"nova-scheduler-0\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.245992 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.733419 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:06 crc kubenswrapper[4749]: I1001 13:28:06.817491 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5667a94-4bef-4b2c-8bc3-274f9090c5d9","Type":"ContainerStarted","Data":"03765646460aec5ea4a4f3e6c583deff142ef3c56fe6770274a983a044c7d4b9"} Oct 01 13:28:07 crc kubenswrapper[4749]: I1001 13:28:07.188986 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:28:07 crc kubenswrapper[4749]: I1001 13:28:07.189384 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:28:07 crc kubenswrapper[4749]: I1001 13:28:07.247166 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c6d349-9c67-41bf-b71c-8a1ce73e765e" path="/var/lib/kubelet/pods/79c6d349-9c67-41bf-b71c-8a1ce73e765e/volumes" Oct 01 13:28:07 crc kubenswrapper[4749]: I1001 13:28:07.829205 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5667a94-4bef-4b2c-8bc3-274f9090c5d9","Type":"ContainerStarted","Data":"2ad1aa5f539d66ddc4091b6edb29c82ae536a8178e4fe4ec2eeb18ac21401f71"} Oct 01 13:28:07 crc kubenswrapper[4749]: I1001 13:28:07.855876 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.855850067 podStartE2EDuration="2.855850067s" 
podCreationTimestamp="2025-10-01 13:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:07.855795386 +0000 UTC m=+1347.909780305" watchObservedRunningTime="2025-10-01 13:28:07.855850067 +0000 UTC m=+1347.909834986" Oct 01 13:28:09 crc kubenswrapper[4749]: I1001 13:28:09.165948 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 01 13:28:11 crc kubenswrapper[4749]: I1001 13:28:11.251435 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 13:28:12 crc kubenswrapper[4749]: I1001 13:28:12.167836 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:28:12 crc kubenswrapper[4749]: I1001 13:28:12.169134 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:28:12 crc kubenswrapper[4749]: I1001 13:28:12.189525 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 13:28:12 crc kubenswrapper[4749]: I1001 13:28:12.189570 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 13:28:13 crc kubenswrapper[4749]: I1001 13:28:13.260922 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:28:13 crc kubenswrapper[4749]: I1001 13:28:13.260979 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:28:13 crc kubenswrapper[4749]: I1001 13:28:13.261026 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 13:28:13 crc kubenswrapper[4749]: I1001 13:28:13.261101 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:28:16 crc kubenswrapper[4749]: I1001 13:28:16.246757 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 13:28:16 crc kubenswrapper[4749]: I1001 13:28:16.287441 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 13:28:16 crc kubenswrapper[4749]: I1001 13:28:16.960233 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.179037 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.179846 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.180269 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.180311 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 13:28:22 crc 
kubenswrapper[4749]: I1001 13:28:22.188661 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.189993 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.196093 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.196993 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.201127 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.453859 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79c7b4b87f-d7lt5"] Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.455688 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.478687 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c7b4b87f-d7lt5"] Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.589157 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7xr4\" (UniqueName: \"kubernetes.io/projected/456a2974-f990-4ace-a841-a20ea0787247-kube-api-access-f7xr4\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.589267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-config\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.589288 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-svc\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.589318 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-swift-storage-0\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.589367 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-nb\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.589402 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-sb\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.701027 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-config\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.701084 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-svc\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.701173 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-swift-storage-0\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.701343 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-nb\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.701461 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-sb\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.701568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7xr4\" (UniqueName: \"kubernetes.io/projected/456a2974-f990-4ace-a841-a20ea0787247-kube-api-access-f7xr4\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.703076 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-config\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.703286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-swift-storage-0\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.717567 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-sb\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.719356 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-nb\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.719656 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-svc\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.755592 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7xr4\" (UniqueName: \"kubernetes.io/projected/456a2974-f990-4ace-a841-a20ea0787247-kube-api-access-f7xr4\") pod \"dnsmasq-dns-79c7b4b87f-d7lt5\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") " pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.782130 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:22 crc kubenswrapper[4749]: I1001 13:28:22.969313 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.020528 4749 generic.go:334] "Generic (PLEG): container finished" podID="05a66324-287f-4459-b601-e79c152c1ba0" containerID="a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21" exitCode=137 Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.020616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05a66324-287f-4459-b601-e79c152c1ba0","Type":"ContainerDied","Data":"a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21"} Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.020659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05a66324-287f-4459-b601-e79c152c1ba0","Type":"ContainerDied","Data":"f59ca6910cc1e1541e138444fe2167dfaf50e78680a2716b8d5c3e0a2f33e769"} Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.020678 4749 scope.go:117] "RemoveContainer" containerID="a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.021724 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.042644 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.069573 4749 scope.go:117] "RemoveContainer" containerID="a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21" Oct 01 13:28:23 crc kubenswrapper[4749]: E1001 13:28:23.072200 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21\": container with ID starting with a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21 not found: ID does not exist" containerID="a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.072250 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21"} err="failed to get container status \"a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21\": rpc error: code = NotFound desc = could not find container \"a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21\": container with ID starting with a023bddb015f103a34db2e3db5b6e2eabf7bd6be4a3c962e8ee16f64ff40ef21 not found: ID does not exist" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.110487 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkzcs\" (UniqueName: \"kubernetes.io/projected/05a66324-287f-4459-b601-e79c152c1ba0-kube-api-access-vkzcs\") pod \"05a66324-287f-4459-b601-e79c152c1ba0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.110689 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-combined-ca-bundle\") pod \"05a66324-287f-4459-b601-e79c152c1ba0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.110818 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-config-data\") pod \"05a66324-287f-4459-b601-e79c152c1ba0\" (UID: \"05a66324-287f-4459-b601-e79c152c1ba0\") " Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.117753 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05a66324-287f-4459-b601-e79c152c1ba0-kube-api-access-vkzcs" (OuterVolumeSpecName: "kube-api-access-vkzcs") pod "05a66324-287f-4459-b601-e79c152c1ba0" (UID: "05a66324-287f-4459-b601-e79c152c1ba0"). InnerVolumeSpecName "kube-api-access-vkzcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.141388 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05a66324-287f-4459-b601-e79c152c1ba0" (UID: "05a66324-287f-4459-b601-e79c152c1ba0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.171367 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-config-data" (OuterVolumeSpecName: "config-data") pod "05a66324-287f-4459-b601-e79c152c1ba0" (UID: "05a66324-287f-4459-b601-e79c152c1ba0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.216442 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.216478 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05a66324-287f-4459-b601-e79c152c1ba0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.216489 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkzcs\" (UniqueName: \"kubernetes.io/projected/05a66324-287f-4459-b601-e79c152c1ba0-kube-api-access-vkzcs\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:23 crc kubenswrapper[4749]: W1001 13:28:23.278369 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod456a2974_f990_4ace_a841_a20ea0787247.slice/crio-89fe1422939946d91d7cd4827d137d4ebb1fd719a7e0026a892eb116aa48574b WatchSource:0}: Error finding container 89fe1422939946d91d7cd4827d137d4ebb1fd719a7e0026a892eb116aa48574b: Status 404 returned error can't find the container with id 89fe1422939946d91d7cd4827d137d4ebb1fd719a7e0026a892eb116aa48574b Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.287462 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79c7b4b87f-d7lt5"] Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.344026 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.350479 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.372176 4749 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:28:23 crc kubenswrapper[4749]: E1001 13:28:23.372939 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05a66324-287f-4459-b601-e79c152c1ba0" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.372963 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="05a66324-287f-4459-b601-e79c152c1ba0" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.373324 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="05a66324-287f-4459-b601-e79c152c1ba0" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.374322 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.377323 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.377537 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.377705 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.412501 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.522036 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.522352 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhzsg\" (UniqueName: \"kubernetes.io/projected/b4293a98-bf1d-47e9-9c16-e272e6c836f7-kube-api-access-nhzsg\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.522575 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.522743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.522932 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.626888 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " 
pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.626995 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.627047 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhzsg\" (UniqueName: \"kubernetes.io/projected/b4293a98-bf1d-47e9-9c16-e272e6c836f7-kube-api-access-nhzsg\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.627111 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.627185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.633278 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc 
kubenswrapper[4749]: I1001 13:28:23.634340 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.636833 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.644651 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4293a98-bf1d-47e9-9c16-e272e6c836f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.651768 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhzsg\" (UniqueName: \"kubernetes.io/projected/b4293a98-bf1d-47e9-9c16-e272e6c836f7-kube-api-access-nhzsg\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4293a98-bf1d-47e9-9c16-e272e6c836f7\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:23 crc kubenswrapper[4749]: I1001 13:28:23.704970 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.030500 4749 generic.go:334] "Generic (PLEG): container finished" podID="456a2974-f990-4ace-a841-a20ea0787247" containerID="88e32108ddfafe1d0f2afd2a67bd001ecde8c5b911bbab178a7a0e22361a1d20" exitCode=0 Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.030771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" event={"ID":"456a2974-f990-4ace-a841-a20ea0787247","Type":"ContainerDied","Data":"88e32108ddfafe1d0f2afd2a67bd001ecde8c5b911bbab178a7a0e22361a1d20"} Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.030796 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" event={"ID":"456a2974-f990-4ace-a841-a20ea0787247","Type":"ContainerStarted","Data":"89fe1422939946d91d7cd4827d137d4ebb1fd719a7e0026a892eb116aa48574b"} Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.200510 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 13:28:24 crc kubenswrapper[4749]: W1001 13:28:24.203511 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4293a98_bf1d_47e9_9c16_e272e6c836f7.slice/crio-fa23acec7f148ab1f71d3f58e5763a8bc9f677cc88ad7e27ced31cdf74e1ae5c WatchSource:0}: Error finding container fa23acec7f148ab1f71d3f58e5763a8bc9f677cc88ad7e27ced31cdf74e1ae5c: Status 404 returned error can't find the container with id fa23acec7f148ab1f71d3f58e5763a8bc9f677cc88ad7e27ced31cdf74e1ae5c Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.775894 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.922757 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 
13:28:24.923072 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="sg-core" containerID="cri-o://a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155" gracePeriod=30
Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.923114 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="ceilometer-notification-agent" containerID="cri-o://7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1" gracePeriod=30
Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.923193 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="proxy-httpd" containerID="cri-o://8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f" gracePeriod=30
Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.923301 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="ceilometer-central-agent" containerID="cri-o://8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6" gracePeriod=30
Oct 01 13:28:24 crc kubenswrapper[4749]: I1001 13:28:24.948186 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.211:3000/\": EOF"
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.046861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" event={"ID":"456a2974-f990-4ace-a841-a20ea0787247","Type":"ContainerStarted","Data":"bfcc875600a9130419aaec6147b5db7d06dad82c8071201adf605ca0b1109fec"}
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.046981 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5"
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.050269 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4293a98-bf1d-47e9-9c16-e272e6c836f7","Type":"ContainerStarted","Data":"5128881b41fcfb07270d84c7c09a108aa6e612bf9c0e8344b3d1c81074e0f0cc"}
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.050304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4293a98-bf1d-47e9-9c16-e272e6c836f7","Type":"ContainerStarted","Data":"fa23acec7f148ab1f71d3f58e5763a8bc9f677cc88ad7e27ced31cdf74e1ae5c"}
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.069708 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" podStartSLOduration=3.069692492 podStartE2EDuration="3.069692492s" podCreationTimestamp="2025-10-01 13:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:25.061493134 +0000 UTC m=+1365.115478043" watchObservedRunningTime="2025-10-01 13:28:25.069692492 +0000 UTC m=+1365.123677391"
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.069789 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerID="a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155" exitCode=2
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.069947 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-log" containerID="cri-o://dff2cb79a21e89a17daf55ea1a475794c553ba6165c3ce83afce82e6a46be43e" gracePeriod=30
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.070153 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerDied","Data":"a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155"}
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.071000 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-api" containerID="cri-o://e698349bb482e78848b0954f08cf804bb2012efbd15bb9f4663316a7a52451b9" gracePeriod=30
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.078484 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.078465466 podStartE2EDuration="2.078465466s" podCreationTimestamp="2025-10-01 13:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:25.078320162 +0000 UTC m=+1365.132305051" watchObservedRunningTime="2025-10-01 13:28:25.078465466 +0000 UTC m=+1365.132450365"
Oct 01 13:28:25 crc kubenswrapper[4749]: I1001 13:28:25.245612 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05a66324-287f-4459-b601-e79c152c1ba0" path="/var/lib/kubelet/pods/05a66324-287f-4459-b601-e79c152c1ba0/volumes"
Oct 01 13:28:25 crc kubenswrapper[4749]: E1001 13:28:25.486319 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc778d06_3a79_4164_a8d7_cdc0ae0b92fc.slice/crio-8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6.scope\": RecentStats: unable to find data in memory cache]"
Oct 01 13:28:26 crc kubenswrapper[4749]: I1001 13:28:26.090466 4749 generic.go:334] "Generic (PLEG): container finished" podID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerID="dff2cb79a21e89a17daf55ea1a475794c553ba6165c3ce83afce82e6a46be43e" exitCode=143
Oct 01 13:28:26 crc kubenswrapper[4749]: I1001 13:28:26.090736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68","Type":"ContainerDied","Data":"dff2cb79a21e89a17daf55ea1a475794c553ba6165c3ce83afce82e6a46be43e"}
Oct 01 13:28:26 crc kubenswrapper[4749]: I1001 13:28:26.093641 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerID="8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f" exitCode=0
Oct 01 13:28:26 crc kubenswrapper[4749]: I1001 13:28:26.093671 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerID="8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6" exitCode=0
Oct 01 13:28:26 crc kubenswrapper[4749]: I1001 13:28:26.093714 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerDied","Data":"8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f"}
Oct 01 13:28:26 crc kubenswrapper[4749]: I1001 13:28:26.093737 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerDied","Data":"8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6"}
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.104640 4749 generic.go:334] "Generic (PLEG): container finished" podID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerID="e698349bb482e78848b0954f08cf804bb2012efbd15bb9f4663316a7a52451b9" exitCode=0
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.104668 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68","Type":"ContainerDied","Data":"e698349bb482e78848b0954f08cf804bb2012efbd15bb9f4663316a7a52451b9"}
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.104976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68","Type":"ContainerDied","Data":"fcb07ecc40026c66e54aba4761661d416ce8bb46fc2530a07d3029092d6d5e6b"}
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.104991 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb07ecc40026c66e54aba4761661d416ce8bb46fc2530a07d3029092d6d5e6b"
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.180970 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.301646 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rzdg\" (UniqueName: \"kubernetes.io/projected/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-kube-api-access-4rzdg\") pod \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") "
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.301683 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-logs\") pod \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") "
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.301715 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-config-data\") pod \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") "
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.301797 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-combined-ca-bundle\") pod \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\" (UID: \"5a1f8c08-7fc4-4abe-876c-78b24a4b6b68\") "
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.303258 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-logs" (OuterVolumeSpecName: "logs") pod "5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" (UID: "5a1f8c08-7fc4-4abe-876c-78b24a4b6b68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.315396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-kube-api-access-4rzdg" (OuterVolumeSpecName: "kube-api-access-4rzdg") pod "5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" (UID: "5a1f8c08-7fc4-4abe-876c-78b24a4b6b68"). InnerVolumeSpecName "kube-api-access-4rzdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.372792 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" (UID: "5a1f8c08-7fc4-4abe-876c-78b24a4b6b68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.375733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-config-data" (OuterVolumeSpecName: "config-data") pod "5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" (UID: "5a1f8c08-7fc4-4abe-876c-78b24a4b6b68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.404719 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.404761 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rzdg\" (UniqueName: \"kubernetes.io/projected/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-kube-api-access-4rzdg\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.404778 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:27 crc kubenswrapper[4749]: I1001 13:28:27.404791 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.111638 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.121088 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerID="7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1" exitCode=0
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.121142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerDied","Data":"7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1"}
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.121171 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.121182 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc","Type":"ContainerDied","Data":"3926e9129045ad063d84b48a8dd558bef04c2fe672b137f0db3f5dbe66e48bd2"}
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.121184 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.121235 4749 scope.go:117] "RemoveContainer" containerID="8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.166956 4749 scope.go:117] "RemoveContainer" containerID="a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.176519 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.186768 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.197466 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.198194 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-log"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198261 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-log"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.198294 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="sg-core"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198304 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="sg-core"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.198334 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="proxy-httpd"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198344 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="proxy-httpd"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.198362 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="ceilometer-central-agent"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198371 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="ceilometer-central-agent"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.198383 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="ceilometer-notification-agent"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198390 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="ceilometer-notification-agent"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.198407 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-api"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198416 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-api"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198661 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="ceilometer-notification-agent"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198684 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-log"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198697 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="ceilometer-central-agent"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198711 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" containerName="nova-api-api"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198725 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="proxy-httpd"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.198745 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" containerName="sg-core"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.200697 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.203367 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.204119 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.204539 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.220889 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-combined-ca-bundle\") pod \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") "
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.220991 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-config-data\") pod \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") "
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.221035 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp7c2\" (UniqueName: \"kubernetes.io/projected/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-kube-api-access-lp7c2\") pod \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") "
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.221091 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-log-httpd\") pod \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") "
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.221277 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-run-httpd\") pod \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") "
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.221315 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-scripts\") pod \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") "
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.221389 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-sg-core-conf-yaml\") pod \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") "
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.221419 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-ceilometer-tls-certs\") pod \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\" (UID: \"dc778d06-3a79-4164-a8d7-cdc0ae0b92fc\") "
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.223901 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" (UID: "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.224067 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" (UID: "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.226645 4749 scope.go:117] "RemoveContainer" containerID="7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.228308 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.231135 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-scripts" (OuterVolumeSpecName: "scripts") pod "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" (UID: "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.241601 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-kube-api-access-lp7c2" (OuterVolumeSpecName: "kube-api-access-lp7c2") pod "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" (UID: "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc"). InnerVolumeSpecName "kube-api-access-lp7c2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.283963 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" (UID: "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.293669 4749 scope.go:117] "RemoveContainer" containerID="8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.303674 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" (UID: "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.313018 4749 scope.go:117] "RemoveContainer" containerID="8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.313635 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f\": container with ID starting with 8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f not found: ID does not exist" containerID="8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.313666 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f"} err="failed to get container status \"8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f\": rpc error: code = NotFound desc = could not find container \"8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f\": container with ID starting with 8edc69a969ec5e11e568c9783592b2961b7c958c29effe8edd76bfb9fd08279f not found: ID does not exist"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.313685 4749 scope.go:117] "RemoveContainer" containerID="a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.313991 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155\": container with ID starting with a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155 not found: ID does not exist" containerID="a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.314127 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155"} err="failed to get container status \"a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155\": rpc error: code = NotFound desc = could not find container \"a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155\": container with ID starting with a6353dd4442861fab195175874775f086a51245276e367c0f73e449f38d4a155 not found: ID does not exist"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.314298 4749 scope.go:117] "RemoveContainer" containerID="7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.314628 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1\": container with ID starting with 7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1 not found: ID does not exist" containerID="7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.314651 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1"} err="failed to get container status \"7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1\": rpc error: code = NotFound desc = could not find container \"7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1\": container with ID starting with 7b9d669fca97703ae70c79bc9d2cad26550c0b2864b67ca6fd0033b6131502e1 not found: ID does not exist"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.314669 4749 scope.go:117] "RemoveContainer" containerID="8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6"
Oct 01 13:28:28 crc kubenswrapper[4749]: E1001 13:28:28.315321 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6\": container with ID starting with 8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6 not found: ID does not exist" containerID="8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.315359 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6"} err="failed to get container status \"8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6\": rpc error: code = NotFound desc = could not find container \"8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6\": container with ID starting with 8c0ece7c830f5e3ef1fdbb828a80dca2071f14fa0b89b4d839e7ea57672d05f6 not found: ID does not exist"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323439 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh5hw\" (UniqueName: \"kubernetes.io/projected/67cc6737-a83a-4812-8f6c-6c62a756676c-kube-api-access-wh5hw\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323492 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67cc6737-a83a-4812-8f6c-6c62a756676c-logs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323513 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-public-tls-certs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323560 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323576 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323596 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-config-data\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323765 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp7c2\" (UniqueName: \"kubernetes.io/projected/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-kube-api-access-lp7c2\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323777 4749 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323785 4749 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323792 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323800 4749 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.323807 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.331579 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" (UID: "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.361613 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-config-data" (OuterVolumeSpecName: "config-data") pod "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" (UID: "dc778d06-3a79-4164-a8d7-cdc0ae0b92fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.426152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh5hw\" (UniqueName: \"kubernetes.io/projected/67cc6737-a83a-4812-8f6c-6c62a756676c-kube-api-access-wh5hw\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.426315 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67cc6737-a83a-4812-8f6c-6c62a756676c-logs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.426357 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-public-tls-certs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.426421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.426454 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.426492 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-config-data\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.426667 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.426687 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.427020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67cc6737-a83a-4812-8f6c-6c62a756676c-logs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.432497 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.432511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-config-data\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.432817 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-public-tls-certs\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.433920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.444995 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh5hw\" (UniqueName: \"kubernetes.io/projected/67cc6737-a83a-4812-8f6c-6c62a756676c-kube-api-access-wh5hw\") pod \"nova-api-0\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " pod="openstack/nova-api-0"
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.485361 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.493304 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.502478 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.505294 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.507382 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.507968 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.510851 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.510894 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.521352 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.637745 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.638709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.638754 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8443de13-c4a9-420c-a4ff-5aa54d222850-run-httpd\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" 
Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.638816 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-config-data\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.639020 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-scripts\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.639247 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.639319 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8443de13-c4a9-420c-a4ff-5aa54d222850-log-httpd\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.639383 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl86b\" (UniqueName: \"kubernetes.io/projected/8443de13-c4a9-420c-a4ff-5aa54d222850-kube-api-access-rl86b\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.705834 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.741485 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.741532 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8443de13-c4a9-420c-a4ff-5aa54d222850-run-httpd\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.741573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-config-data\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.741640 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-scripts\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.741689 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.741725 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8443de13-c4a9-420c-a4ff-5aa54d222850-log-httpd\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.741762 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl86b\" (UniqueName: \"kubernetes.io/projected/8443de13-c4a9-420c-a4ff-5aa54d222850-kube-api-access-rl86b\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.741791 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.745470 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8443de13-c4a9-420c-a4ff-5aa54d222850-log-httpd\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.745562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8443de13-c4a9-420c-a4ff-5aa54d222850-run-httpd\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.748872 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-scripts\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.748930 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.749403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.749448 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.751721 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8443de13-c4a9-420c-a4ff-5aa54d222850-config-data\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.759829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl86b\" (UniqueName: \"kubernetes.io/projected/8443de13-c4a9-420c-a4ff-5aa54d222850-kube-api-access-rl86b\") pod \"ceilometer-0\" (UID: \"8443de13-c4a9-420c-a4ff-5aa54d222850\") " pod="openstack/ceilometer-0" Oct 01 13:28:28 crc kubenswrapper[4749]: I1001 13:28:28.849650 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 13:28:29 crc kubenswrapper[4749]: I1001 13:28:29.035636 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:28:29 crc kubenswrapper[4749]: W1001 13:28:29.053786 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67cc6737_a83a_4812_8f6c_6c62a756676c.slice/crio-ed7e6f8bc0aca2593410aa95e42501abe4a9153d0934ee70b1d0961cbe274ea7 WatchSource:0}: Error finding container ed7e6f8bc0aca2593410aa95e42501abe4a9153d0934ee70b1d0961cbe274ea7: Status 404 returned error can't find the container with id ed7e6f8bc0aca2593410aa95e42501abe4a9153d0934ee70b1d0961cbe274ea7 Oct 01 13:28:29 crc kubenswrapper[4749]: I1001 13:28:29.133860 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67cc6737-a83a-4812-8f6c-6c62a756676c","Type":"ContainerStarted","Data":"ed7e6f8bc0aca2593410aa95e42501abe4a9153d0934ee70b1d0961cbe274ea7"} Oct 01 13:28:29 crc kubenswrapper[4749]: I1001 13:28:29.248259 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1f8c08-7fc4-4abe-876c-78b24a4b6b68" path="/var/lib/kubelet/pods/5a1f8c08-7fc4-4abe-876c-78b24a4b6b68/volumes" Oct 01 13:28:29 crc kubenswrapper[4749]: I1001 13:28:29.249295 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc778d06-3a79-4164-a8d7-cdc0ae0b92fc" path="/var/lib/kubelet/pods/dc778d06-3a79-4164-a8d7-cdc0ae0b92fc/volumes" Oct 01 13:28:29 crc kubenswrapper[4749]: I1001 13:28:29.327165 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 13:28:29 crc kubenswrapper[4749]: W1001 13:28:29.331957 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8443de13_c4a9_420c_a4ff_5aa54d222850.slice/crio-6c7f6ac58ba1966f785e84b1302ad5531d3b4460d3400c2ce14d4ba27a650bc0 
WatchSource:0}: Error finding container 6c7f6ac58ba1966f785e84b1302ad5531d3b4460d3400c2ce14d4ba27a650bc0: Status 404 returned error can't find the container with id 6c7f6ac58ba1966f785e84b1302ad5531d3b4460d3400c2ce14d4ba27a650bc0 Oct 01 13:28:30 crc kubenswrapper[4749]: I1001 13:28:30.147374 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8443de13-c4a9-420c-a4ff-5aa54d222850","Type":"ContainerStarted","Data":"5b2cb7c66f1bc3c96d6a4b49084faed3e0b42040811a4869ef6d6d76ff7b4119"} Oct 01 13:28:30 crc kubenswrapper[4749]: I1001 13:28:30.147738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8443de13-c4a9-420c-a4ff-5aa54d222850","Type":"ContainerStarted","Data":"5f2da3e238e36749814e422dc847fe16f5edc414c23b72a04fc128de0e67bc9e"} Oct 01 13:28:30 crc kubenswrapper[4749]: I1001 13:28:30.147760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8443de13-c4a9-420c-a4ff-5aa54d222850","Type":"ContainerStarted","Data":"6c7f6ac58ba1966f785e84b1302ad5531d3b4460d3400c2ce14d4ba27a650bc0"} Oct 01 13:28:30 crc kubenswrapper[4749]: I1001 13:28:30.149678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67cc6737-a83a-4812-8f6c-6c62a756676c","Type":"ContainerStarted","Data":"0830108309a6537f583a1496bb71b49d00ca0ebfa3d63b0bfa4ab1527d5847b8"} Oct 01 13:28:30 crc kubenswrapper[4749]: I1001 13:28:30.149713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67cc6737-a83a-4812-8f6c-6c62a756676c","Type":"ContainerStarted","Data":"7b5e15ca7b9ecee46e513c989f2b0c6411d304b27e0a9a869ea5b3e5f591e6af"} Oct 01 13:28:30 crc kubenswrapper[4749]: I1001 13:28:30.183713 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.183683409 podStartE2EDuration="2.183683409s" podCreationTimestamp="2025-10-01 13:28:28 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:30.172162505 +0000 UTC m=+1370.226147464" watchObservedRunningTime="2025-10-01 13:28:30.183683409 +0000 UTC m=+1370.237668338" Oct 01 13:28:31 crc kubenswrapper[4749]: I1001 13:28:31.164690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8443de13-c4a9-420c-a4ff-5aa54d222850","Type":"ContainerStarted","Data":"26eb8fce0d857bdb13e2038910a0fac1ec3c9ca8cb82da2add789a27125128ff"} Oct 01 13:28:32 crc kubenswrapper[4749]: I1001 13:28:32.783814 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:28:32 crc kubenswrapper[4749]: I1001 13:28:32.855088 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599b6fdf6c-xm8tc"] Oct 01 13:28:32 crc kubenswrapper[4749]: I1001 13:28:32.855407 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" podUID="67837f8f-64b7-46b6-868c-8a9abb273f36" containerName="dnsmasq-dns" containerID="cri-o://f7b72765801a8a7084e320c5f82aec38e7861ac383f821b991f8545bdb648673" gracePeriod=10 Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.187629 4749 generic.go:334] "Generic (PLEG): container finished" podID="67837f8f-64b7-46b6-868c-8a9abb273f36" containerID="f7b72765801a8a7084e320c5f82aec38e7861ac383f821b991f8545bdb648673" exitCode=0 Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.187713 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" event={"ID":"67837f8f-64b7-46b6-868c-8a9abb273f36","Type":"ContainerDied","Data":"f7b72765801a8a7084e320c5f82aec38e7861ac383f821b991f8545bdb648673"} Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.191042 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8443de13-c4a9-420c-a4ff-5aa54d222850","Type":"ContainerStarted","Data":"3914f68487e8bb996a23120f7c0b484ed971349e74dead3f1fb141fd5f58717a"} Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.191205 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.233763 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.113863047 podStartE2EDuration="5.233739928s" podCreationTimestamp="2025-10-01 13:28:28 +0000 UTC" firstStartedPulling="2025-10-01 13:28:29.334787402 +0000 UTC m=+1369.388772301" lastFinishedPulling="2025-10-01 13:28:32.454664243 +0000 UTC m=+1372.508649182" observedRunningTime="2025-10-01 13:28:33.215434208 +0000 UTC m=+1373.269419107" watchObservedRunningTime="2025-10-01 13:28:33.233739928 +0000 UTC m=+1373.287724827" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.352538 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.434151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-config\") pod \"67837f8f-64b7-46b6-868c-8a9abb273f36\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.434199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-svc\") pod \"67837f8f-64b7-46b6-868c-8a9abb273f36\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.434247 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mshtn\" (UniqueName: \"kubernetes.io/projected/67837f8f-64b7-46b6-868c-8a9abb273f36-kube-api-access-mshtn\") pod \"67837f8f-64b7-46b6-868c-8a9abb273f36\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.434332 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-nb\") pod \"67837f8f-64b7-46b6-868c-8a9abb273f36\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.434442 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-sb\") pod \"67837f8f-64b7-46b6-868c-8a9abb273f36\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.434506 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-swift-storage-0\") pod \"67837f8f-64b7-46b6-868c-8a9abb273f36\" (UID: \"67837f8f-64b7-46b6-868c-8a9abb273f36\") " Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.441950 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67837f8f-64b7-46b6-868c-8a9abb273f36-kube-api-access-mshtn" (OuterVolumeSpecName: "kube-api-access-mshtn") pod "67837f8f-64b7-46b6-868c-8a9abb273f36" (UID: "67837f8f-64b7-46b6-868c-8a9abb273f36"). InnerVolumeSpecName "kube-api-access-mshtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.486689 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67837f8f-64b7-46b6-868c-8a9abb273f36" (UID: "67837f8f-64b7-46b6-868c-8a9abb273f36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.490847 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-config" (OuterVolumeSpecName: "config") pod "67837f8f-64b7-46b6-868c-8a9abb273f36" (UID: "67837f8f-64b7-46b6-868c-8a9abb273f36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.496677 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67837f8f-64b7-46b6-868c-8a9abb273f36" (UID: "67837f8f-64b7-46b6-868c-8a9abb273f36"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.498479 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67837f8f-64b7-46b6-868c-8a9abb273f36" (UID: "67837f8f-64b7-46b6-868c-8a9abb273f36"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.514256 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "67837f8f-64b7-46b6-868c-8a9abb273f36" (UID: "67837f8f-64b7-46b6-868c-8a9abb273f36"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.538550 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.538822 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.538831 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.538838 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mshtn\" (UniqueName: \"kubernetes.io/projected/67837f8f-64b7-46b6-868c-8a9abb273f36-kube-api-access-mshtn\") on node 
\"crc\" DevicePath \"\"" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.538849 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.538857 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67837f8f-64b7-46b6-868c-8a9abb273f36-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.705267 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:33 crc kubenswrapper[4749]: I1001 13:28:33.726618 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.205301 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.205351 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-599b6fdf6c-xm8tc" event={"ID":"67837f8f-64b7-46b6-868c-8a9abb273f36","Type":"ContainerDied","Data":"377c4c29e43f43e62e8dc07e827cf5dcb890618dbf57c449692d4302e9b57661"} Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.206514 4749 scope.go:117] "RemoveContainer" containerID="f7b72765801a8a7084e320c5f82aec38e7861ac383f821b991f8545bdb648673" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.225997 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.246730 4749 scope.go:117] "RemoveContainer" containerID="3d9bba667ef9acca67b52baaba8b9de703baf6dc622f09faa22af62d7fb18c73" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.247550 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-599b6fdf6c-xm8tc"] Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.264438 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-599b6fdf6c-xm8tc"] Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.491159 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-z7885"] Oct 01 13:28:34 crc kubenswrapper[4749]: E1001 13:28:34.491570 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67837f8f-64b7-46b6-868c-8a9abb273f36" containerName="init" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.491582 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="67837f8f-64b7-46b6-868c-8a9abb273f36" containerName="init" Oct 01 13:28:34 crc kubenswrapper[4749]: E1001 13:28:34.491599 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67837f8f-64b7-46b6-868c-8a9abb273f36" containerName="dnsmasq-dns" Oct 01 13:28:34 
crc kubenswrapper[4749]: I1001 13:28:34.491605 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="67837f8f-64b7-46b6-868c-8a9abb273f36" containerName="dnsmasq-dns" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.491783 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="67837f8f-64b7-46b6-868c-8a9abb273f36" containerName="dnsmasq-dns" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.492408 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z7885" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.495202 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.496242 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.504401 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z7885"] Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.563345 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv4kx\" (UniqueName: \"kubernetes.io/projected/4b410813-5ad4-464d-a144-34501ae862c1-kube-api-access-lv4kx\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.563433 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-scripts\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885" Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.563490 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.563529 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-config-data\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.665498 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-scripts\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.665907 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.665961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-config-data\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.666037 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv4kx\" (UniqueName: \"kubernetes.io/projected/4b410813-5ad4-464d-a144-34501ae862c1-kube-api-access-lv4kx\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.672252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-scripts\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.673562 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-config-data\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.674162 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.683671 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv4kx\" (UniqueName: \"kubernetes.io/projected/4b410813-5ad4-464d-a144-34501ae862c1-kube-api-access-lv4kx\") pod \"nova-cell1-cell-mapping-z7885\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") " pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:34 crc kubenswrapper[4749]: I1001 13:28:34.849178 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:35 crc kubenswrapper[4749]: I1001 13:28:35.252176 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67837f8f-64b7-46b6-868c-8a9abb273f36" path="/var/lib/kubelet/pods/67837f8f-64b7-46b6-868c-8a9abb273f36/volumes"
Oct 01 13:28:35 crc kubenswrapper[4749]: I1001 13:28:35.349681 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-z7885"]
Oct 01 13:28:36 crc kubenswrapper[4749]: I1001 13:28:36.225078 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z7885" event={"ID":"4b410813-5ad4-464d-a144-34501ae862c1","Type":"ContainerStarted","Data":"23d0d19b2b004bd8022fb1f8982fcb13b59edbbbe30ba9c98ad02f4458766cd1"}
Oct 01 13:28:36 crc kubenswrapper[4749]: I1001 13:28:36.225824 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z7885" event={"ID":"4b410813-5ad4-464d-a144-34501ae862c1","Type":"ContainerStarted","Data":"dbc290e98a4fbb4c9843671ae0166ac1818a03ca9964202c0d02483e92dcedb3"}
Oct 01 13:28:36 crc kubenswrapper[4749]: I1001 13:28:36.244845 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-z7885" podStartSLOduration=2.244826708 podStartE2EDuration="2.244826708s" podCreationTimestamp="2025-10-01 13:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:36.241751849 +0000 UTC m=+1376.295736758" watchObservedRunningTime="2025-10-01 13:28:36.244826708 +0000 UTC m=+1376.298811617"
Oct 01 13:28:38 crc kubenswrapper[4749]: I1001 13:28:38.521745 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 01 13:28:38 crc kubenswrapper[4749]: I1001 13:28:38.522072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 01 13:28:39 crc kubenswrapper[4749]: I1001 13:28:39.544565 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 01 13:28:39 crc kubenswrapper[4749]: I1001 13:28:39.544574 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 01 13:28:41 crc kubenswrapper[4749]: I1001 13:28:41.295828 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b410813-5ad4-464d-a144-34501ae862c1" containerID="23d0d19b2b004bd8022fb1f8982fcb13b59edbbbe30ba9c98ad02f4458766cd1" exitCode=0
Oct 01 13:28:41 crc kubenswrapper[4749]: I1001 13:28:41.295906 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z7885" event={"ID":"4b410813-5ad4-464d-a144-34501ae862c1","Type":"ContainerDied","Data":"23d0d19b2b004bd8022fb1f8982fcb13b59edbbbe30ba9c98ad02f4458766cd1"}
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.787911 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.839099 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-scripts\") pod \"4b410813-5ad4-464d-a144-34501ae862c1\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") "
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.839357 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-combined-ca-bundle\") pod \"4b410813-5ad4-464d-a144-34501ae862c1\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") "
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.839628 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-config-data\") pod \"4b410813-5ad4-464d-a144-34501ae862c1\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") "
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.839707 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lv4kx\" (UniqueName: \"kubernetes.io/projected/4b410813-5ad4-464d-a144-34501ae862c1-kube-api-access-lv4kx\") pod \"4b410813-5ad4-464d-a144-34501ae862c1\" (UID: \"4b410813-5ad4-464d-a144-34501ae862c1\") "
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.846660 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-scripts" (OuterVolumeSpecName: "scripts") pod "4b410813-5ad4-464d-a144-34501ae862c1" (UID: "4b410813-5ad4-464d-a144-34501ae862c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.846871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b410813-5ad4-464d-a144-34501ae862c1-kube-api-access-lv4kx" (OuterVolumeSpecName: "kube-api-access-lv4kx") pod "4b410813-5ad4-464d-a144-34501ae862c1" (UID: "4b410813-5ad4-464d-a144-34501ae862c1"). InnerVolumeSpecName "kube-api-access-lv4kx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.881504 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-config-data" (OuterVolumeSpecName: "config-data") pod "4b410813-5ad4-464d-a144-34501ae862c1" (UID: "4b410813-5ad4-464d-a144-34501ae862c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.889937 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b410813-5ad4-464d-a144-34501ae862c1" (UID: "4b410813-5ad4-464d-a144-34501ae862c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.942440 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.942495 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lv4kx\" (UniqueName: \"kubernetes.io/projected/4b410813-5ad4-464d-a144-34501ae862c1-kube-api-access-lv4kx\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.942512 4749 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:42 crc kubenswrapper[4749]: I1001 13:28:42.942524 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b410813-5ad4-464d-a144-34501ae862c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.323441 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-z7885" event={"ID":"4b410813-5ad4-464d-a144-34501ae862c1","Type":"ContainerDied","Data":"dbc290e98a4fbb4c9843671ae0166ac1818a03ca9964202c0d02483e92dcedb3"}
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.323491 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbc290e98a4fbb4c9843671ae0166ac1818a03ca9964202c0d02483e92dcedb3"
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.323535 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-z7885"
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.491616 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.491846 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-log" containerID="cri-o://7b5e15ca7b9ecee46e513c989f2b0c6411d304b27e0a9a869ea5b3e5f591e6af" gracePeriod=30
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.491969 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-api" containerID="cri-o://0830108309a6537f583a1496bb71b49d00ca0ebfa3d63b0bfa4ab1527d5847b8" gracePeriod=30
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.508043 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.508314 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f5667a94-4bef-4b2c-8bc3-274f9090c5d9" containerName="nova-scheduler-scheduler" containerID="cri-o://2ad1aa5f539d66ddc4091b6edb29c82ae536a8178e4fe4ec2eeb18ac21401f71" gracePeriod=30
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.525256 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.525527 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-log" containerID="cri-o://32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c" gracePeriod=30
Oct 01 13:28:43 crc kubenswrapper[4749]: I1001 13:28:43.525631 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-metadata" containerID="cri-o://1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950" gracePeriod=30
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.337051 4749 generic.go:334] "Generic (PLEG): container finished" podID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerID="7b5e15ca7b9ecee46e513c989f2b0c6411d304b27e0a9a869ea5b3e5f591e6af" exitCode=143
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.337143 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67cc6737-a83a-4812-8f6c-6c62a756676c","Type":"ContainerDied","Data":"7b5e15ca7b9ecee46e513c989f2b0c6411d304b27e0a9a869ea5b3e5f591e6af"}
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.340242 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7762644e-6f15-46ba-82dc-de5a50a63f5b","Type":"ContainerDied","Data":"32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c"}
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.340240 4749 generic.go:334] "Generic (PLEG): container finished" podID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerID="32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c" exitCode=143
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.869615 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.980491 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-nova-metadata-tls-certs\") pod \"7762644e-6f15-46ba-82dc-de5a50a63f5b\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") "
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.980618 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzdh7\" (UniqueName: \"kubernetes.io/projected/7762644e-6f15-46ba-82dc-de5a50a63f5b-kube-api-access-lzdh7\") pod \"7762644e-6f15-46ba-82dc-de5a50a63f5b\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") "
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.980705 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-config-data\") pod \"7762644e-6f15-46ba-82dc-de5a50a63f5b\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") "
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.980788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-combined-ca-bundle\") pod \"7762644e-6f15-46ba-82dc-de5a50a63f5b\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") "
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.980861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7762644e-6f15-46ba-82dc-de5a50a63f5b-logs\") pod \"7762644e-6f15-46ba-82dc-de5a50a63f5b\" (UID: \"7762644e-6f15-46ba-82dc-de5a50a63f5b\") "
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.981569 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7762644e-6f15-46ba-82dc-de5a50a63f5b-logs" (OuterVolumeSpecName: "logs") pod "7762644e-6f15-46ba-82dc-de5a50a63f5b" (UID: "7762644e-6f15-46ba-82dc-de5a50a63f5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:28:44 crc kubenswrapper[4749]: I1001 13:28:44.989842 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7762644e-6f15-46ba-82dc-de5a50a63f5b-kube-api-access-lzdh7" (OuterVolumeSpecName: "kube-api-access-lzdh7") pod "7762644e-6f15-46ba-82dc-de5a50a63f5b" (UID: "7762644e-6f15-46ba-82dc-de5a50a63f5b"). InnerVolumeSpecName "kube-api-access-lzdh7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.009610 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-config-data" (OuterVolumeSpecName: "config-data") pod "7762644e-6f15-46ba-82dc-de5a50a63f5b" (UID: "7762644e-6f15-46ba-82dc-de5a50a63f5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.013999 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7762644e-6f15-46ba-82dc-de5a50a63f5b" (UID: "7762644e-6f15-46ba-82dc-de5a50a63f5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.048276 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7762644e-6f15-46ba-82dc-de5a50a63f5b" (UID: "7762644e-6f15-46ba-82dc-de5a50a63f5b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.083267 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.083308 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7762644e-6f15-46ba-82dc-de5a50a63f5b-logs\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.083321 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.083436 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzdh7\" (UniqueName: \"kubernetes.io/projected/7762644e-6f15-46ba-82dc-de5a50a63f5b-kube-api-access-lzdh7\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.083483 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7762644e-6f15-46ba-82dc-de5a50a63f5b-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.362397 4749 generic.go:334] "Generic (PLEG): container finished" podID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerID="0830108309a6537f583a1496bb71b49d00ca0ebfa3d63b0bfa4ab1527d5847b8" exitCode=0
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.362483 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67cc6737-a83a-4812-8f6c-6c62a756676c","Type":"ContainerDied","Data":"0830108309a6537f583a1496bb71b49d00ca0ebfa3d63b0bfa4ab1527d5847b8"}
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.366031 4749 generic.go:334] "Generic (PLEG): container finished" podID="f5667a94-4bef-4b2c-8bc3-274f9090c5d9" containerID="2ad1aa5f539d66ddc4091b6edb29c82ae536a8178e4fe4ec2eeb18ac21401f71" exitCode=0
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.366086 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5667a94-4bef-4b2c-8bc3-274f9090c5d9","Type":"ContainerDied","Data":"2ad1aa5f539d66ddc4091b6edb29c82ae536a8178e4fe4ec2eeb18ac21401f71"}
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.368058 4749 generic.go:334] "Generic (PLEG): container finished" podID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerID="1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950" exitCode=0
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.368084 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7762644e-6f15-46ba-82dc-de5a50a63f5b","Type":"ContainerDied","Data":"1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950"}
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.368099 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7762644e-6f15-46ba-82dc-de5a50a63f5b","Type":"ContainerDied","Data":"fe4d93741d735a3dcaf44cc01df5dc05bae78ba65c01f0fe30f04c59265fb5eb"}
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.368117 4749 scope.go:117] "RemoveContainer" containerID="1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.368325 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.373502 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.433988 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.440470 4749 scope.go:117] "RemoveContainer" containerID="32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.449993 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464056 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:45 crc kubenswrapper[4749]: E1001 13:28:45.464527 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b410813-5ad4-464d-a144-34501ae862c1" containerName="nova-manage"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464546 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b410813-5ad4-464d-a144-34501ae862c1" containerName="nova-manage"
Oct 01 13:28:45 crc kubenswrapper[4749]: E1001 13:28:45.464567 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5667a94-4bef-4b2c-8bc3-274f9090c5d9" containerName="nova-scheduler-scheduler"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464573 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5667a94-4bef-4b2c-8bc3-274f9090c5d9" containerName="nova-scheduler-scheduler"
Oct 01 13:28:45 crc kubenswrapper[4749]: E1001 13:28:45.464591 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-log"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464596 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-log"
Oct 01 13:28:45 crc kubenswrapper[4749]: E1001 13:28:45.464609 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-metadata"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464615 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-metadata"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464792 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5667a94-4bef-4b2c-8bc3-274f9090c5d9" containerName="nova-scheduler-scheduler"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464807 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-log"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464824 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b410813-5ad4-464d-a144-34501ae862c1" containerName="nova-manage"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.464834 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" containerName="nova-metadata-metadata"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.466059 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.468865 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.469078 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.469783 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.473062 4749 scope.go:117] "RemoveContainer" containerID="1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950"
Oct 01 13:28:45 crc kubenswrapper[4749]: E1001 13:28:45.473486 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950\": container with ID starting with 1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950 not found: ID does not exist" containerID="1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.473516 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950"} err="failed to get container status \"1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950\": rpc error: code = NotFound desc = could not find container \"1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950\": container with ID starting with 1acc4eaa4f688c071041edf8f059f36cac365a8c80f5ec7f84a5ae4619ee9950 not found: ID does not exist"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.473540 4749 scope.go:117] "RemoveContainer" containerID="32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c"
Oct 01 13:28:45 crc kubenswrapper[4749]: E1001 13:28:45.473770 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c\": container with ID starting with 32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c not found: ID does not exist" containerID="32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.473798 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c"} err="failed to get container status \"32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c\": rpc error: code = NotFound desc = could not find container \"32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c\": container with ID starting with 32fbe947efb6196b290f6a88ea8de3825ca90fd5dcecdcdce6f1f6b6b6ec5f2c not found: ID does not exist"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.498111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-config-data\") pod \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") "
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.498330 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-combined-ca-bundle\") pod \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") "
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.498399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbmls\" (UniqueName: \"kubernetes.io/projected/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-kube-api-access-cbmls\") pod \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\" (UID: \"f5667a94-4bef-4b2c-8bc3-274f9090c5d9\") "
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.498514 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-config-data\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.498554 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-logs\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.498574 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.498678 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngmcr\" (UniqueName: \"kubernetes.io/projected/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-kube-api-access-ngmcr\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.498713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.510843 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-kube-api-access-cbmls" (OuterVolumeSpecName: "kube-api-access-cbmls") pod "f5667a94-4bef-4b2c-8bc3-274f9090c5d9" (UID: "f5667a94-4bef-4b2c-8bc3-274f9090c5d9"). InnerVolumeSpecName "kube-api-access-cbmls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.534687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-config-data" (OuterVolumeSpecName: "config-data") pod "f5667a94-4bef-4b2c-8bc3-274f9090c5d9" (UID: "f5667a94-4bef-4b2c-8bc3-274f9090c5d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.542069 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5667a94-4bef-4b2c-8bc3-274f9090c5d9" (UID: "f5667a94-4bef-4b2c-8bc3-274f9090c5d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.600797 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngmcr\" (UniqueName: \"kubernetes.io/projected/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-kube-api-access-ngmcr\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.600873 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.600900 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-config-data\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.600950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-logs\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.600972 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.601023 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbmls\" (UniqueName: \"kubernetes.io/projected/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-kube-api-access-cbmls\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.601034 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.601043 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5667a94-4bef-4b2c-8bc3-274f9090c5d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.602180 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-logs\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.604732 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.606722 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0"
Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.607456 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-config-data\")
pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.620377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngmcr\" (UniqueName: \"kubernetes.io/projected/2c53c696-a24d-4024-86dc-2ce22e1a2e8e-kube-api-access-ngmcr\") pod \"nova-metadata-0\" (UID: \"2c53c696-a24d-4024-86dc-2ce22e1a2e8e\") " pod="openstack/nova-metadata-0" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.634293 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.704270 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh5hw\" (UniqueName: \"kubernetes.io/projected/67cc6737-a83a-4812-8f6c-6c62a756676c-kube-api-access-wh5hw\") pod \"67cc6737-a83a-4812-8f6c-6c62a756676c\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.704321 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-internal-tls-certs\") pod \"67cc6737-a83a-4812-8f6c-6c62a756676c\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.704399 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-combined-ca-bundle\") pod \"67cc6737-a83a-4812-8f6c-6c62a756676c\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.704607 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-public-tls-certs\") pod 
\"67cc6737-a83a-4812-8f6c-6c62a756676c\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.704740 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-config-data\") pod \"67cc6737-a83a-4812-8f6c-6c62a756676c\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.704815 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67cc6737-a83a-4812-8f6c-6c62a756676c-logs\") pod \"67cc6737-a83a-4812-8f6c-6c62a756676c\" (UID: \"67cc6737-a83a-4812-8f6c-6c62a756676c\") " Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.705729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67cc6737-a83a-4812-8f6c-6c62a756676c-logs" (OuterVolumeSpecName: "logs") pod "67cc6737-a83a-4812-8f6c-6c62a756676c" (UID: "67cc6737-a83a-4812-8f6c-6c62a756676c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.737735 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67cc6737-a83a-4812-8f6c-6c62a756676c-kube-api-access-wh5hw" (OuterVolumeSpecName: "kube-api-access-wh5hw") pod "67cc6737-a83a-4812-8f6c-6c62a756676c" (UID: "67cc6737-a83a-4812-8f6c-6c62a756676c"). InnerVolumeSpecName "kube-api-access-wh5hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.787758 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.791563 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-config-data" (OuterVolumeSpecName: "config-data") pod "67cc6737-a83a-4812-8f6c-6c62a756676c" (UID: "67cc6737-a83a-4812-8f6c-6c62a756676c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.795335 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67cc6737-a83a-4812-8f6c-6c62a756676c" (UID: "67cc6737-a83a-4812-8f6c-6c62a756676c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.807967 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.807994 4749 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67cc6737-a83a-4812-8f6c-6c62a756676c-logs\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.808006 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh5hw\" (UniqueName: \"kubernetes.io/projected/67cc6737-a83a-4812-8f6c-6c62a756676c-kube-api-access-wh5hw\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.808015 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 
01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.869521 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "67cc6737-a83a-4812-8f6c-6c62a756676c" (UID: "67cc6737-a83a-4812-8f6c-6c62a756676c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.902068 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "67cc6737-a83a-4812-8f6c-6c62a756676c" (UID: "67cc6737-a83a-4812-8f6c-6c62a756676c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.910549 4749 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:45 crc kubenswrapper[4749]: I1001 13:28:45.910601 4749 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67cc6737-a83a-4812-8f6c-6c62a756676c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 13:28:46 crc kubenswrapper[4749]: W1001 13:28:46.330063 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c53c696_a24d_4024_86dc_2ce22e1a2e8e.slice/crio-c5dd0ce430707607e134e0735999f1740aa86f439eb1316e9f46d6aa7ae08507 WatchSource:0}: Error finding container c5dd0ce430707607e134e0735999f1740aa86f439eb1316e9f46d6aa7ae08507: Status 404 returned error can't find the container with id c5dd0ce430707607e134e0735999f1740aa86f439eb1316e9f46d6aa7ae08507 Oct 01 13:28:46 crc 
kubenswrapper[4749]: I1001 13:28:46.342004 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.382580 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67cc6737-a83a-4812-8f6c-6c62a756676c","Type":"ContainerDied","Data":"ed7e6f8bc0aca2593410aa95e42501abe4a9153d0934ee70b1d0961cbe274ea7"} Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.382624 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.382634 4749 scope.go:117] "RemoveContainer" containerID="0830108309a6537f583a1496bb71b49d00ca0ebfa3d63b0bfa4ab1527d5847b8" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.386142 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5667a94-4bef-4b2c-8bc3-274f9090c5d9","Type":"ContainerDied","Data":"03765646460aec5ea4a4f3e6c583deff142ef3c56fe6770274a983a044c7d4b9"} Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.386443 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.392984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c53c696-a24d-4024-86dc-2ce22e1a2e8e","Type":"ContainerStarted","Data":"c5dd0ce430707607e134e0735999f1740aa86f439eb1316e9f46d6aa7ae08507"} Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.447100 4749 scope.go:117] "RemoveContainer" containerID="7b5e15ca7b9ecee46e513c989f2b0c6411d304b27e0a9a869ea5b3e5f591e6af" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.484604 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.489309 4749 scope.go:117] "RemoveContainer" containerID="2ad1aa5f539d66ddc4091b6edb29c82ae536a8178e4fe4ec2eeb18ac21401f71" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.495387 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.515942 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.534777 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.547973 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: E1001 13:28:46.548916 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-api" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.549076 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-api" Oct 01 13:28:46 crc kubenswrapper[4749]: E1001 13:28:46.549253 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-log" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.549355 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-log" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.549803 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-log" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.549924 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" containerName="nova-api-api" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.551809 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.559539 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.559802 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.559821 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.565949 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.567291 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.572196 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.579852 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.595615 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.624693 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1097258f-f21a-4b28-935a-d7dea1d508dd-config-data\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.624942 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90086f57-f0d4-4a80-9606-d225410b66e2-logs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.624991 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-config-data\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.625020 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n595\" (UniqueName: \"kubernetes.io/projected/1097258f-f21a-4b28-935a-d7dea1d508dd-kube-api-access-6n595\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " 
pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.625075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.625122 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tht9c\" (UniqueName: \"kubernetes.io/projected/90086f57-f0d4-4a80-9606-d225410b66e2-kube-api-access-tht9c\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.625147 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.625202 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-public-tls-certs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.625240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1097258f-f21a-4b28-935a-d7dea1d508dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.726962 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-config-data\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.727022 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n595\" (UniqueName: \"kubernetes.io/projected/1097258f-f21a-4b28-935a-d7dea1d508dd-kube-api-access-6n595\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.727072 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.727125 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tht9c\" (UniqueName: \"kubernetes.io/projected/90086f57-f0d4-4a80-9606-d225410b66e2-kube-api-access-tht9c\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.727157 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.727248 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-public-tls-certs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.727281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1097258f-f21a-4b28-935a-d7dea1d508dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.727324 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1097258f-f21a-4b28-935a-d7dea1d508dd-config-data\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.727351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90086f57-f0d4-4a80-9606-d225410b66e2-logs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.728285 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90086f57-f0d4-4a80-9606-d225410b66e2-logs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.733463 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-public-tls-certs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.734899 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-config-data\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.735618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1097258f-f21a-4b28-935a-d7dea1d508dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.737892 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.738043 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1097258f-f21a-4b28-935a-d7dea1d508dd-config-data\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.738180 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/90086f57-f0d4-4a80-9606-d225410b66e2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.745442 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tht9c\" (UniqueName: \"kubernetes.io/projected/90086f57-f0d4-4a80-9606-d225410b66e2-kube-api-access-tht9c\") pod \"nova-api-0\" (UID: 
\"90086f57-f0d4-4a80-9606-d225410b66e2\") " pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.747975 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n595\" (UniqueName: \"kubernetes.io/projected/1097258f-f21a-4b28-935a-d7dea1d508dd-kube-api-access-6n595\") pod \"nova-scheduler-0\" (UID: \"1097258f-f21a-4b28-935a-d7dea1d508dd\") " pod="openstack/nova-scheduler-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.876421 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 13:28:46 crc kubenswrapper[4749]: I1001 13:28:46.889892 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.252129 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67cc6737-a83a-4812-8f6c-6c62a756676c" path="/var/lib/kubelet/pods/67cc6737-a83a-4812-8f6c-6c62a756676c/volumes" Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.253851 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7762644e-6f15-46ba-82dc-de5a50a63f5b" path="/var/lib/kubelet/pods/7762644e-6f15-46ba-82dc-de5a50a63f5b/volumes" Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.254999 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5667a94-4bef-4b2c-8bc3-274f9090c5d9" path="/var/lib/kubelet/pods/f5667a94-4bef-4b2c-8bc3-274f9090c5d9/volumes" Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.341590 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 13:28:47 crc kubenswrapper[4749]: W1001 13:28:47.361067 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90086f57_f0d4_4a80_9606_d225410b66e2.slice/crio-0dbcde81f92db0757a9ec5e61e58ea1fd5661665bf44947b0af7a9576ec3518a 
WatchSource:0}: Error finding container 0dbcde81f92db0757a9ec5e61e58ea1fd5661665bf44947b0af7a9576ec3518a: Status 404 returned error can't find the container with id 0dbcde81f92db0757a9ec5e61e58ea1fd5661665bf44947b0af7a9576ec3518a Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.406311 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90086f57-f0d4-4a80-9606-d225410b66e2","Type":"ContainerStarted","Data":"0dbcde81f92db0757a9ec5e61e58ea1fd5661665bf44947b0af7a9576ec3518a"} Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.412010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c53c696-a24d-4024-86dc-2ce22e1a2e8e","Type":"ContainerStarted","Data":"14f06c065c98dda00b281291fa3e35e48e15042b765684a5652579873f29c9e8"} Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.412080 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2c53c696-a24d-4024-86dc-2ce22e1a2e8e","Type":"ContainerStarted","Data":"a5727ea9e89f2369accb96163bb2053c3935fd1032fa0b5b32ea88edf05f51d7"} Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.451634 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4516096 podStartE2EDuration="2.4516096s" podCreationTimestamp="2025-10-01 13:28:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:47.435892555 +0000 UTC m=+1387.489877484" watchObservedRunningTime="2025-10-01 13:28:47.4516096 +0000 UTC m=+1387.505594539" Oct 01 13:28:47 crc kubenswrapper[4749]: I1001 13:28:47.472837 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 13:28:47 crc kubenswrapper[4749]: W1001 13:28:47.473856 4749 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1097258f_f21a_4b28_935a_d7dea1d508dd.slice/crio-d2af28d9c74a287d96ae0c503a0d3b4bb7b5604ee5530ea5b1b4f760c61f32af WatchSource:0}: Error finding container d2af28d9c74a287d96ae0c503a0d3b4bb7b5604ee5530ea5b1b4f760c61f32af: Status 404 returned error can't find the container with id d2af28d9c74a287d96ae0c503a0d3b4bb7b5604ee5530ea5b1b4f760c61f32af Oct 01 13:28:48 crc kubenswrapper[4749]: I1001 13:28:48.428445 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1097258f-f21a-4b28-935a-d7dea1d508dd","Type":"ContainerStarted","Data":"a8672463800d0bbb4f8beb4ce145626daab3b620babb1732d9bd1e3b4ce29ba4"} Oct 01 13:28:48 crc kubenswrapper[4749]: I1001 13:28:48.428762 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1097258f-f21a-4b28-935a-d7dea1d508dd","Type":"ContainerStarted","Data":"d2af28d9c74a287d96ae0c503a0d3b4bb7b5604ee5530ea5b1b4f760c61f32af"} Oct 01 13:28:48 crc kubenswrapper[4749]: I1001 13:28:48.434616 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90086f57-f0d4-4a80-9606-d225410b66e2","Type":"ContainerStarted","Data":"f21a48b9fa58c86436c061f5dacdcdd1a6c13cff6a8ba798ce3109777cf4e624"} Oct 01 13:28:48 crc kubenswrapper[4749]: I1001 13:28:48.434662 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"90086f57-f0d4-4a80-9606-d225410b66e2","Type":"ContainerStarted","Data":"55bf1eda14547c281d80de0a469d4bba836ebca8c442c5ffda168e5067f6e0da"} Oct 01 13:28:48 crc kubenswrapper[4749]: I1001 13:28:48.459175 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.459150942 podStartE2EDuration="2.459150942s" podCreationTimestamp="2025-10-01 13:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-01 13:28:48.451967074 +0000 UTC m=+1388.505952003" watchObservedRunningTime="2025-10-01 13:28:48.459150942 +0000 UTC m=+1388.513135841" Oct 01 13:28:48 crc kubenswrapper[4749]: I1001 13:28:48.482051 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.482029224 podStartE2EDuration="2.482029224s" podCreationTimestamp="2025-10-01 13:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:28:48.471259533 +0000 UTC m=+1388.525244452" watchObservedRunningTime="2025-10-01 13:28:48.482029224 +0000 UTC m=+1388.536014143" Oct 01 13:28:50 crc kubenswrapper[4749]: I1001 13:28:50.788777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:28:50 crc kubenswrapper[4749]: I1001 13:28:50.789373 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 13:28:51 crc kubenswrapper[4749]: I1001 13:28:51.890614 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 13:28:55 crc kubenswrapper[4749]: I1001 13:28:55.788411 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 13:28:55 crc kubenswrapper[4749]: I1001 13:28:55.788909 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 13:28:56 crc kubenswrapper[4749]: I1001 13:28:56.804629 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2c53c696-a24d-4024-86dc-2ce22e1a2e8e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:28:56 crc kubenswrapper[4749]: I1001 13:28:56.804674 
4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2c53c696-a24d-4024-86dc-2ce22e1a2e8e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:28:56 crc kubenswrapper[4749]: I1001 13:28:56.877010 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:28:56 crc kubenswrapper[4749]: I1001 13:28:56.877060 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 13:28:56 crc kubenswrapper[4749]: I1001 13:28:56.891097 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 13:28:56 crc kubenswrapper[4749]: I1001 13:28:56.927102 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 13:28:57 crc kubenswrapper[4749]: I1001 13:28:57.589483 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 13:28:57 crc kubenswrapper[4749]: I1001 13:28:57.892366 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="90086f57-f0d4-4a80-9606-d225410b66e2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:28:57 crc kubenswrapper[4749]: I1001 13:28:57.892401 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="90086f57-f0d4-4a80-9606-d225410b66e2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 13:28:58 crc kubenswrapper[4749]: I1001 13:28:58.899580 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 13:29:05 crc kubenswrapper[4749]: I1001 13:29:05.795992 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 13:29:05 crc kubenswrapper[4749]: I1001 13:29:05.796837 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 13:29:05 crc kubenswrapper[4749]: I1001 13:29:05.802011 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 13:29:05 crc kubenswrapper[4749]: I1001 13:29:05.809982 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 13:29:06 crc kubenswrapper[4749]: I1001 13:29:06.890521 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 13:29:06 crc kubenswrapper[4749]: I1001 13:29:06.891071 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 13:29:06 crc kubenswrapper[4749]: I1001 13:29:06.902912 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 13:29:06 crc kubenswrapper[4749]: I1001 13:29:06.915346 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 13:29:07 crc kubenswrapper[4749]: I1001 13:29:07.654387 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 13:29:07 crc kubenswrapper[4749]: I1001 13:29:07.664377 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.091342 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4bjl"] Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.095937 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.104012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4bjl"] Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.202506 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-utilities\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.202557 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-catalog-content\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.202801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw7gq\" (UniqueName: \"kubernetes.io/projected/90520931-0b89-488b-b072-7c9a807717f1-kube-api-access-vw7gq\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.304396 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw7gq\" (UniqueName: \"kubernetes.io/projected/90520931-0b89-488b-b072-7c9a807717f1-kube-api-access-vw7gq\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.304572 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-utilities\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.304592 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-catalog-content\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.304997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-utilities\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.305037 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-catalog-content\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.325307 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw7gq\" (UniqueName: \"kubernetes.io/projected/90520931-0b89-488b-b072-7c9a807717f1-kube-api-access-vw7gq\") pod \"redhat-operators-h4bjl\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.419571 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:08 crc kubenswrapper[4749]: I1001 13:29:08.908149 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4bjl"] Oct 01 13:29:09 crc kubenswrapper[4749]: I1001 13:29:09.689523 4749 generic.go:334] "Generic (PLEG): container finished" podID="90520931-0b89-488b-b072-7c9a807717f1" containerID="d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5" exitCode=0 Oct 01 13:29:09 crc kubenswrapper[4749]: I1001 13:29:09.689617 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4bjl" event={"ID":"90520931-0b89-488b-b072-7c9a807717f1","Type":"ContainerDied","Data":"d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5"} Oct 01 13:29:09 crc kubenswrapper[4749]: I1001 13:29:09.689835 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4bjl" event={"ID":"90520931-0b89-488b-b072-7c9a807717f1","Type":"ContainerStarted","Data":"6a3a99387cc0568c1be5c957ebc19f73c5251abb4e5303d74327590d4b202cb6"} Oct 01 13:29:11 crc kubenswrapper[4749]: I1001 13:29:11.714766 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4bjl" event={"ID":"90520931-0b89-488b-b072-7c9a807717f1","Type":"ContainerStarted","Data":"b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf"} Oct 01 13:29:12 crc kubenswrapper[4749]: I1001 13:29:12.731598 4749 generic.go:334] "Generic (PLEG): container finished" podID="90520931-0b89-488b-b072-7c9a807717f1" containerID="b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf" exitCode=0 Oct 01 13:29:12 crc kubenswrapper[4749]: I1001 13:29:12.731718 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4bjl" 
event={"ID":"90520931-0b89-488b-b072-7c9a807717f1","Type":"ContainerDied","Data":"b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf"} Oct 01 13:29:15 crc kubenswrapper[4749]: I1001 13:29:15.791741 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:29:16 crc kubenswrapper[4749]: I1001 13:29:16.709600 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:29:16 crc kubenswrapper[4749]: I1001 13:29:16.768823 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4bjl" event={"ID":"90520931-0b89-488b-b072-7c9a807717f1","Type":"ContainerStarted","Data":"9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4"} Oct 01 13:29:16 crc kubenswrapper[4749]: I1001 13:29:16.787374 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4bjl" podStartSLOduration=2.550157564 podStartE2EDuration="8.787351972s" podCreationTimestamp="2025-10-01 13:29:08 +0000 UTC" firstStartedPulling="2025-10-01 13:29:09.692931356 +0000 UTC m=+1409.746916255" lastFinishedPulling="2025-10-01 13:29:15.930125764 +0000 UTC m=+1415.984110663" observedRunningTime="2025-10-01 13:29:16.784111388 +0000 UTC m=+1416.838096287" watchObservedRunningTime="2025-10-01 13:29:16.787351972 +0000 UTC m=+1416.841336871" Oct 01 13:29:18 crc kubenswrapper[4749]: I1001 13:29:18.420320 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:18 crc kubenswrapper[4749]: I1001 13:29:18.420646 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:19 crc kubenswrapper[4749]: I1001 13:29:19.469344 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h4bjl" 
podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="registry-server" probeResult="failure" output=< Oct 01 13:29:19 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Oct 01 13:29:19 crc kubenswrapper[4749]: > Oct 01 13:29:19 crc kubenswrapper[4749]: I1001 13:29:19.581417 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerName="rabbitmq" containerID="cri-o://c42ac4cb417747efda2345fa343f1c6c20a214153c3b3843b66e4e747ae8e5f7" gracePeriod=604797 Oct 01 13:29:19 crc kubenswrapper[4749]: I1001 13:29:19.818277 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused" Oct 01 13:29:20 crc kubenswrapper[4749]: I1001 13:29:20.256535 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="35e1759f-e27a-4891-9fc0-37753b25689d" containerName="rabbitmq" containerID="cri-o://8499483ffc6abaf600fe62853a84bef67de5586a9a9977d0c6e190fce38d33da" gracePeriod=604797 Oct 01 13:29:20 crc kubenswrapper[4749]: I1001 13:29:20.807932 4749 generic.go:334] "Generic (PLEG): container finished" podID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerID="c42ac4cb417747efda2345fa343f1c6c20a214153c3b3843b66e4e747ae8e5f7" exitCode=0 Oct 01 13:29:20 crc kubenswrapper[4749]: I1001 13:29:20.807981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89621c5c-1d46-44be-852f-1a37dccf02e9","Type":"ContainerDied","Data":"c42ac4cb417747efda2345fa343f1c6c20a214153c3b3843b66e4e747ae8e5f7"} Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.237823 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.291994 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-config-data\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.292182 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-plugins\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.292760 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.292856 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89621c5c-1d46-44be-852f-1a37dccf02e9-erlang-cookie-secret\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.293148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-server-conf\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.293205 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.293334 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-tls\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.293462 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-erlang-cookie\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.293502 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-plugins-conf\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.293548 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rdjp\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-kube-api-access-7rdjp\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.293611 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-confd\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.293647 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89621c5c-1d46-44be-852f-1a37dccf02e9-pod-info\") pod \"89621c5c-1d46-44be-852f-1a37dccf02e9\" (UID: \"89621c5c-1d46-44be-852f-1a37dccf02e9\") " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.294379 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.294828 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.295802 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.299788 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.300369 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.306740 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/89621c5c-1d46-44be-852f-1a37dccf02e9-pod-info" (OuterVolumeSpecName: "pod-info") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.312713 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89621c5c-1d46-44be-852f-1a37dccf02e9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.338993 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-kube-api-access-7rdjp" (OuterVolumeSpecName: "kube-api-access-7rdjp") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "kube-api-access-7rdjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.342369 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-config-data" (OuterVolumeSpecName: "config-data") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.396769 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89621c5c-1d46-44be-852f-1a37dccf02e9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.396819 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.396830 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.396841 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.396853 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.396862 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rdjp\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-kube-api-access-7rdjp\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.396871 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89621c5c-1d46-44be-852f-1a37dccf02e9-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 
13:29:21.396879 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.428731 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.434808 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-server-conf" (OuterVolumeSpecName: "server-conf") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.501635 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89621c5c-1d46-44be-852f-1a37dccf02e9-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.501665 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.514597 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "89621c5c-1d46-44be-852f-1a37dccf02e9" (UID: "89621c5c-1d46-44be-852f-1a37dccf02e9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.609464 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89621c5c-1d46-44be-852f-1a37dccf02e9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.826659 4749 generic.go:334] "Generic (PLEG): container finished" podID="35e1759f-e27a-4891-9fc0-37753b25689d" containerID="8499483ffc6abaf600fe62853a84bef67de5586a9a9977d0c6e190fce38d33da" exitCode=0 Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.826734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35e1759f-e27a-4891-9fc0-37753b25689d","Type":"ContainerDied","Data":"8499483ffc6abaf600fe62853a84bef67de5586a9a9977d0c6e190fce38d33da"} Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.836248 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"89621c5c-1d46-44be-852f-1a37dccf02e9","Type":"ContainerDied","Data":"df2500bae23962e3728d43cfc79e984492a602cd453ccb3fe88cf2078b2e10e7"} Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.836300 4749 scope.go:117] "RemoveContainer" containerID="c42ac4cb417747efda2345fa343f1c6c20a214153c3b3843b66e4e747ae8e5f7" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.836547 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.954411 4749 scope.go:117] "RemoveContainer" containerID="6432798b0d8b723cf7c554daa2ac158cb3aa44a8340879d92c784631824689c2" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.955300 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.972456 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:29:21 crc kubenswrapper[4749]: I1001 13:29:21.981080 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.020599 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-server-conf\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.020679 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35e1759f-e27a-4891-9fc0-37753b25689d-pod-info\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.020716 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-erlang-cookie\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.020741 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.020788 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-confd\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.020888 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-plugins-conf\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.020930 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-tls\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.021053 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-plugins\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.021104 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-config-data\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.021151 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35e1759f-e27a-4891-9fc0-37753b25689d-erlang-cookie-secret\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 
13:29:22.021230 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxhgw\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-kube-api-access-gxhgw\") pod \"35e1759f-e27a-4891-9fc0-37753b25689d\" (UID: \"35e1759f-e27a-4891-9fc0-37753b25689d\") " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.031989 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.032573 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.033288 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.041287 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:29:22 crc kubenswrapper[4749]: E1001 13:29:22.041791 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerName="setup-container" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.041819 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerName="setup-container" Oct 01 13:29:22 crc kubenswrapper[4749]: E1001 13:29:22.041849 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e1759f-e27a-4891-9fc0-37753b25689d" containerName="setup-container" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.041858 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e1759f-e27a-4891-9fc0-37753b25689d" containerName="setup-container" Oct 01 13:29:22 crc kubenswrapper[4749]: E1001 13:29:22.041870 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e1759f-e27a-4891-9fc0-37753b25689d" containerName="rabbitmq" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.041878 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e1759f-e27a-4891-9fc0-37753b25689d" containerName="rabbitmq" Oct 01 13:29:22 crc kubenswrapper[4749]: E1001 13:29:22.041922 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerName="rabbitmq" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.041932 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerName="rabbitmq" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.042176 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e1759f-e27a-4891-9fc0-37753b25689d" containerName="rabbitmq" Oct 01 13:29:22 crc 
kubenswrapper[4749]: I1001 13:29:22.042237 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" containerName="rabbitmq" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.043594 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e1759f-e27a-4891-9fc0-37753b25689d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.050472 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.051668 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.058316 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-kube-api-access-gxhgw" (OuterVolumeSpecName: "kube-api-access-gxhgw") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "kube-api-access-gxhgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.058776 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.060804 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.062778 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-gfcg5" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.064480 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.064604 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.064615 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.064689 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.064693 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.072661 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/35e1759f-e27a-4891-9fc0-37753b25689d-pod-info" (OuterVolumeSpecName: "pod-info") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.082915 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.124141 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.124190 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b5aa915-bf5a-4046-834c-6051ed420f42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.124267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.124282 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.124302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dsgx\" (UniqueName: 
\"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-kube-api-access-8dsgx\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.124333 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.124348 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.125761 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b5aa915-bf5a-4046-834c-6051ed420f42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.125833 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.125895 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126075 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126434 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126454 4749 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35e1759f-e27a-4891-9fc0-37753b25689d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126464 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxhgw\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-kube-api-access-gxhgw\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126473 4749 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35e1759f-e27a-4891-9fc0-37753b25689d-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126485 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126502 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126511 4749 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.126520 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.152643 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.163452 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-config-data" (OuterVolumeSpecName: "config-data") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.183312 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-server-conf" (OuterVolumeSpecName: "server-conf") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.216352 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "35e1759f-e27a-4891-9fc0-37753b25689d" (UID: "35e1759f-e27a-4891-9fc0-37753b25689d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.228990 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.229131 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.229441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b5aa915-bf5a-4046-834c-6051ed420f42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230290 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc 
kubenswrapper[4749]: I1001 13:29:22.230435 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230512 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230650 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230719 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b5aa915-bf5a-4046-834c-6051ed420f42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230786 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230820 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230851 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dsgx\" (UniqueName: \"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-kube-api-access-8dsgx\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230936 4749 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230959 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230972 4749 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35e1759f-e27a-4891-9fc0-37753b25689d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230986 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35e1759f-e27a-4891-9fc0-37753b25689d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.230988 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-config-data\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: 
I1001 13:29:22.231518 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.232374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.232780 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.233423 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.234544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b5aa915-bf5a-4046-834c-6051ed420f42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.237267 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/5b5aa915-bf5a-4046-834c-6051ed420f42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.242130 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.243128 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.249441 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b5aa915-bf5a-4046-834c-6051ed420f42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.251979 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dsgx\" (UniqueName: \"kubernetes.io/projected/5b5aa915-bf5a-4046-834c-6051ed420f42-kube-api-access-8dsgx\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.280956 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-server-0\" (UID: \"5b5aa915-bf5a-4046-834c-6051ed420f42\") " pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc 
kubenswrapper[4749]: I1001 13:29:22.453497 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.849546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35e1759f-e27a-4891-9fc0-37753b25689d","Type":"ContainerDied","Data":"f48f67800299fda7910707cb5ce3116921337ba37ef85bfbd1284727b8e01a1e"} Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.849857 4749 scope.go:117] "RemoveContainer" containerID="8499483ffc6abaf600fe62853a84bef67de5586a9a9977d0c6e190fce38d33da" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.849990 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.897748 4749 scope.go:117] "RemoveContainer" containerID="be178b2e3c636505e7559b1f4f4b2b933f257fb8e1218e28feced6d711498217" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.934765 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.950568 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.963248 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.965275 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.968035 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.968392 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8tf6x" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.968456 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.968520 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.968608 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.968617 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.968391 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 13:29:22 crc kubenswrapper[4749]: I1001 13:29:22.992840 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.007048 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.051802 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 
13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5jhc\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-kube-api-access-s5jhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052424 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052472 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052538 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052565 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052647 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052813 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.052884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.154946 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155012 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155038 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5jhc\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-kube-api-access-s5jhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155075 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155156 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 
13:29:23.155183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155243 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155280 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155412 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155746 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155758 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.155961 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.156140 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.156598 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.156725 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.164089 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.164575 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.164836 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.165709 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc 
kubenswrapper[4749]: I1001 13:29:23.177242 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5jhc\" (UniqueName: \"kubernetes.io/projected/811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7-kube-api-access-s5jhc\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.200679 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.254521 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e1759f-e27a-4891-9fc0-37753b25689d" path="/var/lib/kubelet/pods/35e1759f-e27a-4891-9fc0-37753b25689d/volumes" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.255843 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89621c5c-1d46-44be-852f-1a37dccf02e9" path="/var/lib/kubelet/pods/89621c5c-1d46-44be-852f-1a37dccf02e9/volumes" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.302904 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.762483 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 13:29:23 crc kubenswrapper[4749]: W1001 13:29:23.764275 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod811135e0_fdbb_4e6e_bd9f_13d54ba7f4f7.slice/crio-f1c0419359154958710e98848d65a924cba32463e809f4a434d9ce10b73e28c2 WatchSource:0}: Error finding container f1c0419359154958710e98848d65a924cba32463e809f4a434d9ce10b73e28c2: Status 404 returned error can't find the container with id f1c0419359154958710e98848d65a924cba32463e809f4a434d9ce10b73e28c2 Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.882841 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b5aa915-bf5a-4046-834c-6051ed420f42","Type":"ContainerStarted","Data":"3813ef529fa2e61cac8b4e13096e72ecea8b16e47e731a275bb39a8b9926c837"} Oct 01 13:29:23 crc kubenswrapper[4749]: I1001 13:29:23.894976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7","Type":"ContainerStarted","Data":"f1c0419359154958710e98848d65a924cba32463e809f4a434d9ce10b73e28c2"} Oct 01 13:29:25 crc kubenswrapper[4749]: I1001 13:29:25.916090 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b5aa915-bf5a-4046-834c-6051ed420f42","Type":"ContainerStarted","Data":"5368ade1a927c23b68b5dd32066a35b83f41e3408e454c1e9b751f52ae1f00e3"} Oct 01 13:29:25 crc kubenswrapper[4749]: I1001 13:29:25.917439 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7","Type":"ContainerStarted","Data":"e6543fb74f3e73522f6b4c9c4a80f31d90da328b9449c79d7baef9050e3d21f1"} Oct 01 
13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.277611 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lzfmx"] Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.280636 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.314178 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzfmx"] Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.367461 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-utilities\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.367589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-catalog-content\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.367637 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz6lm\" (UniqueName: \"kubernetes.io/projected/53163753-4229-4d98-b47d-f469b87dc8b1-kube-api-access-cz6lm\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.465460 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:28 crc kubenswrapper[4749]: 
I1001 13:29:28.469453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz6lm\" (UniqueName: \"kubernetes.io/projected/53163753-4229-4d98-b47d-f469b87dc8b1-kube-api-access-cz6lm\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.469652 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-utilities\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.469698 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-catalog-content\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.470199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-catalog-content\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.470335 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-utilities\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.492970 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz6lm\" (UniqueName: \"kubernetes.io/projected/53163753-4229-4d98-b47d-f469b87dc8b1-kube-api-access-cz6lm\") pod \"redhat-marketplace-lzfmx\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") " pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.532200 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:28 crc kubenswrapper[4749]: I1001 13:29:28.618337 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzfmx" Oct 01 13:29:29 crc kubenswrapper[4749]: I1001 13:29:29.094302 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzfmx"] Oct 01 13:29:29 crc kubenswrapper[4749]: I1001 13:29:29.959810 4749 generic.go:334] "Generic (PLEG): container finished" podID="53163753-4229-4d98-b47d-f469b87dc8b1" containerID="1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9" exitCode=0 Oct 01 13:29:29 crc kubenswrapper[4749]: I1001 13:29:29.960373 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzfmx" event={"ID":"53163753-4229-4d98-b47d-f469b87dc8b1","Type":"ContainerDied","Data":"1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9"} Oct 01 13:29:29 crc kubenswrapper[4749]: I1001 13:29:29.960400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzfmx" event={"ID":"53163753-4229-4d98-b47d-f469b87dc8b1","Type":"ContainerStarted","Data":"5e2e39635d261d42775f8badd63afb278e7cf3e7ca2fd46bc058d62434dfa683"} Oct 01 13:29:30 crc kubenswrapper[4749]: I1001 13:29:30.849843 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4bjl"] Oct 01 13:29:30 crc kubenswrapper[4749]: I1001 
13:29:30.850083 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h4bjl" podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="registry-server" containerID="cri-o://9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4" gracePeriod=2 Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.373129 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.429925 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-catalog-content\") pod \"90520931-0b89-488b-b072-7c9a807717f1\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.430024 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw7gq\" (UniqueName: \"kubernetes.io/projected/90520931-0b89-488b-b072-7c9a807717f1-kube-api-access-vw7gq\") pod \"90520931-0b89-488b-b072-7c9a807717f1\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.430066 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-utilities\") pod \"90520931-0b89-488b-b072-7c9a807717f1\" (UID: \"90520931-0b89-488b-b072-7c9a807717f1\") " Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.434931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-utilities" (OuterVolumeSpecName: "utilities") pod "90520931-0b89-488b-b072-7c9a807717f1" (UID: "90520931-0b89-488b-b072-7c9a807717f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.437575 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90520931-0b89-488b-b072-7c9a807717f1-kube-api-access-vw7gq" (OuterVolumeSpecName: "kube-api-access-vw7gq") pod "90520931-0b89-488b-b072-7c9a807717f1" (UID: "90520931-0b89-488b-b072-7c9a807717f1"). InnerVolumeSpecName "kube-api-access-vw7gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.510470 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90520931-0b89-488b-b072-7c9a807717f1" (UID: "90520931-0b89-488b-b072-7c9a807717f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.533568 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.533616 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw7gq\" (UniqueName: \"kubernetes.io/projected/90520931-0b89-488b-b072-7c9a807717f1-kube-api-access-vw7gq\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.533631 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90520931-0b89-488b-b072-7c9a807717f1-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.981760 4749 generic.go:334] "Generic (PLEG): container finished" podID="90520931-0b89-488b-b072-7c9a807717f1" 
containerID="9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4" exitCode=0 Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.981798 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4bjl" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.981868 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4bjl" event={"ID":"90520931-0b89-488b-b072-7c9a807717f1","Type":"ContainerDied","Data":"9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4"} Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.981924 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4bjl" event={"ID":"90520931-0b89-488b-b072-7c9a807717f1","Type":"ContainerDied","Data":"6a3a99387cc0568c1be5c957ebc19f73c5251abb4e5303d74327590d4b202cb6"} Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.981950 4749 scope.go:117] "RemoveContainer" containerID="9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4" Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.987261 4749 generic.go:334] "Generic (PLEG): container finished" podID="53163753-4229-4d98-b47d-f469b87dc8b1" containerID="80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4" exitCode=0 Oct 01 13:29:31 crc kubenswrapper[4749]: I1001 13:29:31.987329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzfmx" event={"ID":"53163753-4229-4d98-b47d-f469b87dc8b1","Type":"ContainerDied","Data":"80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4"} Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.019041 4749 scope.go:117] "RemoveContainer" containerID="b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.040898 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-h4bjl"] Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.054649 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h4bjl"] Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.054673 4749 scope.go:117] "RemoveContainer" containerID="d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.090628 4749 scope.go:117] "RemoveContainer" containerID="9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4" Oct 01 13:29:32 crc kubenswrapper[4749]: E1001 13:29:32.091157 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4\": container with ID starting with 9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4 not found: ID does not exist" containerID="9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.091194 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4"} err="failed to get container status \"9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4\": rpc error: code = NotFound desc = could not find container \"9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4\": container with ID starting with 9f63edc5cb0ed635e89a47be6cc3d87af0da66ad56dda8b63870aafca9ae1be4 not found: ID does not exist" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.091291 4749 scope.go:117] "RemoveContainer" containerID="b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf" Oct 01 13:29:32 crc kubenswrapper[4749]: E1001 13:29:32.091522 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf\": container with ID starting with b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf not found: ID does not exist" containerID="b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.091544 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf"} err="failed to get container status \"b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf\": rpc error: code = NotFound desc = could not find container \"b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf\": container with ID starting with b5159a5542a95660d4d52602fac4653c3512d8aa8a0cd37e832cba55d9e25adf not found: ID does not exist" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.091557 4749 scope.go:117] "RemoveContainer" containerID="d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5" Oct 01 13:29:32 crc kubenswrapper[4749]: E1001 13:29:32.091685 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5\": container with ID starting with d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5 not found: ID does not exist" containerID="d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.091703 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5"} err="failed to get container status \"d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5\": rpc error: code = NotFound desc = could not find container \"d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5\": container 
with ID starting with d0e4959a28d8c12c6151ea71bbc7a522dab9c24b19f9274cad39d6dccf8c33e5 not found: ID does not exist" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.253147 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5864d459-zjnc6"] Oct 01 13:29:32 crc kubenswrapper[4749]: E1001 13:29:32.253963 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="extract-utilities" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.253978 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="extract-utilities" Oct 01 13:29:32 crc kubenswrapper[4749]: E1001 13:29:32.254006 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="extract-content" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.254012 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="extract-content" Oct 01 13:29:32 crc kubenswrapper[4749]: E1001 13:29:32.254025 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="registry-server" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.254031 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="registry-server" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.254250 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="90520931-0b89-488b-b072-7c9a807717f1" containerName="registry-server" Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.256108 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.257874 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.268972 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5864d459-zjnc6"]
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.349726 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpngw\" (UniqueName: \"kubernetes.io/projected/2e200dd4-74df-427c-851a-8b2e993f186b-kube-api-access-zpngw\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.349801 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-svc\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.349865 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.349882 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.349897 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.349949 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-config\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.350023 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.451743 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpngw\" (UniqueName: \"kubernetes.io/projected/2e200dd4-74df-427c-851a-8b2e993f186b-kube-api-access-zpngw\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.451806 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-svc\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.451838 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.451855 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.451884 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.451921 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-config\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.451961 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.452953 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.453694 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.453710 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-openstack-edpm-ipam\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.453951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.454165 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-svc\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.454633 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-config\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.474890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpngw\" (UniqueName: \"kubernetes.io/projected/2e200dd4-74df-427c-851a-8b2e993f186b-kube-api-access-zpngw\") pod \"dnsmasq-dns-5c5864d459-zjnc6\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:32 crc kubenswrapper[4749]: I1001 13:29:32.578831 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:33 crc kubenswrapper[4749]: I1001 13:29:33.003984 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzfmx" event={"ID":"53163753-4229-4d98-b47d-f469b87dc8b1","Type":"ContainerStarted","Data":"2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec"}
Oct 01 13:29:33 crc kubenswrapper[4749]: I1001 13:29:33.029717 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lzfmx" podStartSLOduration=2.440854397 podStartE2EDuration="5.029701656s" podCreationTimestamp="2025-10-01 13:29:28 +0000 UTC" firstStartedPulling="2025-10-01 13:29:29.964606916 +0000 UTC m=+1430.018591815" lastFinishedPulling="2025-10-01 13:29:32.553454175 +0000 UTC m=+1432.607439074" observedRunningTime="2025-10-01 13:29:33.021330051 +0000 UTC m=+1433.075314960" watchObservedRunningTime="2025-10-01 13:29:33.029701656 +0000 UTC m=+1433.083686555"
Oct 01 13:29:33 crc kubenswrapper[4749]: I1001 13:29:33.181290 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5864d459-zjnc6"]
Oct 01 13:29:33 crc kubenswrapper[4749]: I1001 13:29:33.250109 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90520931-0b89-488b-b072-7c9a807717f1" path="/var/lib/kubelet/pods/90520931-0b89-488b-b072-7c9a807717f1/volumes"
Oct 01 13:29:34 crc kubenswrapper[4749]: I1001 13:29:34.015019 4749 generic.go:334] "Generic (PLEG): container finished" podID="2e200dd4-74df-427c-851a-8b2e993f186b" containerID="dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7" exitCode=0
Oct 01 13:29:34 crc kubenswrapper[4749]: I1001 13:29:34.015074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" event={"ID":"2e200dd4-74df-427c-851a-8b2e993f186b","Type":"ContainerDied","Data":"dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7"}
Oct 01 13:29:34 crc kubenswrapper[4749]: I1001 13:29:34.015458 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" event={"ID":"2e200dd4-74df-427c-851a-8b2e993f186b","Type":"ContainerStarted","Data":"54015c60763edea90f938c37e0cd2a0273d4dbfa88e1e2678653fb0af4bc6cae"}
Oct 01 13:29:35 crc kubenswrapper[4749]: I1001 13:29:35.028624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" event={"ID":"2e200dd4-74df-427c-851a-8b2e993f186b","Type":"ContainerStarted","Data":"3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e"}
Oct 01 13:29:35 crc kubenswrapper[4749]: I1001 13:29:35.029466 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:35 crc kubenswrapper[4749]: I1001 13:29:35.061582 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" podStartSLOduration=3.061554186 podStartE2EDuration="3.061554186s" podCreationTimestamp="2025-10-01 13:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:29:35.046253517 +0000 UTC m=+1435.100238436" watchObservedRunningTime="2025-10-01 13:29:35.061554186 +0000 UTC m=+1435.115539135"
Oct 01 13:29:38 crc kubenswrapper[4749]: I1001 13:29:38.619407 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lzfmx"
Oct 01 13:29:38 crc kubenswrapper[4749]: I1001 13:29:38.619944 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lzfmx"
Oct 01 13:29:38 crc kubenswrapper[4749]: I1001 13:29:38.672252 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lzfmx"
Oct 01 13:29:39 crc kubenswrapper[4749]: I1001 13:29:39.176965 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lzfmx"
Oct 01 13:29:39 crc kubenswrapper[4749]: I1001 13:29:39.293103 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzfmx"]
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.125441 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lzfmx" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" containerName="registry-server" containerID="cri-o://2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec" gracePeriod=2
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.722272 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzfmx"
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.849530 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-catalog-content\") pod \"53163753-4229-4d98-b47d-f469b87dc8b1\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") "
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.849770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-utilities\") pod \"53163753-4229-4d98-b47d-f469b87dc8b1\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") "
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.849945 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz6lm\" (UniqueName: \"kubernetes.io/projected/53163753-4229-4d98-b47d-f469b87dc8b1-kube-api-access-cz6lm\") pod \"53163753-4229-4d98-b47d-f469b87dc8b1\" (UID: \"53163753-4229-4d98-b47d-f469b87dc8b1\") "
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.850905 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-utilities" (OuterVolumeSpecName: "utilities") pod "53163753-4229-4d98-b47d-f469b87dc8b1" (UID: "53163753-4229-4d98-b47d-f469b87dc8b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.854888 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53163753-4229-4d98-b47d-f469b87dc8b1-kube-api-access-cz6lm" (OuterVolumeSpecName: "kube-api-access-cz6lm") pod "53163753-4229-4d98-b47d-f469b87dc8b1" (UID: "53163753-4229-4d98-b47d-f469b87dc8b1"). InnerVolumeSpecName "kube-api-access-cz6lm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.863149 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53163753-4229-4d98-b47d-f469b87dc8b1" (UID: "53163753-4229-4d98-b47d-f469b87dc8b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.952443 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz6lm\" (UniqueName: \"kubernetes.io/projected/53163753-4229-4d98-b47d-f469b87dc8b1-kube-api-access-cz6lm\") on node \"crc\" DevicePath \"\""
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.952495 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 13:29:41 crc kubenswrapper[4749]: I1001 13:29:41.952515 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53163753-4229-4d98-b47d-f469b87dc8b1-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.137947 4749 generic.go:334] "Generic (PLEG): container finished" podID="53163753-4229-4d98-b47d-f469b87dc8b1" containerID="2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec" exitCode=0
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.138185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzfmx" event={"ID":"53163753-4229-4d98-b47d-f469b87dc8b1","Type":"ContainerDied","Data":"2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec"}
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.139838 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzfmx" event={"ID":"53163753-4229-4d98-b47d-f469b87dc8b1","Type":"ContainerDied","Data":"5e2e39635d261d42775f8badd63afb278e7cf3e7ca2fd46bc058d62434dfa683"}
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.139957 4749 scope.go:117] "RemoveContainer" containerID="2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.138306 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzfmx"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.162978 4749 scope.go:117] "RemoveContainer" containerID="80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.186570 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzfmx"]
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.199751 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzfmx"]
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.205702 4749 scope.go:117] "RemoveContainer" containerID="1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.255922 4749 scope.go:117] "RemoveContainer" containerID="2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec"
Oct 01 13:29:42 crc kubenswrapper[4749]: E1001 13:29:42.256472 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec\": container with ID starting with 2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec not found: ID does not exist" containerID="2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.256505 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec"} err="failed to get container status \"2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec\": rpc error: code = NotFound desc = could not find container \"2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec\": container with ID starting with 2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec not found: ID does not exist"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.256524 4749 scope.go:117] "RemoveContainer" containerID="80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4"
Oct 01 13:29:42 crc kubenswrapper[4749]: E1001 13:29:42.256840 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4\": container with ID starting with 80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4 not found: ID does not exist" containerID="80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.256866 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4"} err="failed to get container status \"80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4\": rpc error: code = NotFound desc = could not find container \"80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4\": container with ID starting with 80934dcc21f3791c8dc5271476b51c58dbb762345d298dc16348f674290ee6d4 not found: ID does not exist"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.256887 4749 scope.go:117] "RemoveContainer" containerID="1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9"
Oct 01 13:29:42 crc kubenswrapper[4749]: E1001 13:29:42.257109 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9\": container with ID starting with 1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9 not found: ID does not exist" containerID="1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.257132 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9"} err="failed to get container status \"1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9\": rpc error: code = NotFound desc = could not find container \"1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9\": container with ID starting with 1c933e7696a7ec1708df657095aef67fdbc3d0470e8eaa5caf2207a78580a9e9 not found: ID does not exist"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.581335 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.666743 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c7b4b87f-d7lt5"]
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.667050 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" podUID="456a2974-f990-4ace-a841-a20ea0787247" containerName="dnsmasq-dns" containerID="cri-o://bfcc875600a9130419aaec6147b5db7d06dad82c8071201adf605ca0b1109fec" gracePeriod=10
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.783315 4749 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" podUID="456a2974-f990-4ace-a841-a20ea0787247" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.216:5353: connect: connection refused"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.823714 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ff49d554c-jx4l4"]
Oct 01 13:29:42 crc kubenswrapper[4749]: E1001 13:29:42.825031 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" containerName="extract-utilities"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.825216 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" containerName="extract-utilities"
Oct 01 13:29:42 crc kubenswrapper[4749]: E1001 13:29:42.825315 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" containerName="extract-content"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.825381 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" containerName="extract-content"
Oct 01 13:29:42 crc kubenswrapper[4749]: E1001 13:29:42.825478 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" containerName="registry-server"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.825551 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" containerName="registry-server"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.825866 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" containerName="registry-server"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.827288 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.841863 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff49d554c-jx4l4"]
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.877348 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-ovsdbserver-nb\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.877659 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kjg6\" (UniqueName: \"kubernetes.io/projected/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-kube-api-access-8kjg6\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.877742 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-dns-svc\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.877882 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-openstack-edpm-ipam\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.877980 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-ovsdbserver-sb\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.878093 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-config\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.878166 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-dns-swift-storage-0\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.980511 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-ovsdbserver-nb\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.980562 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kjg6\" (UniqueName: \"kubernetes.io/projected/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-kube-api-access-8kjg6\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.980579 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-dns-svc\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.980617 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-openstack-edpm-ipam\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.980667 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-ovsdbserver-sb\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.980730 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-config\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.981205 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-dns-swift-storage-0\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.982466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-dns-swift-storage-0\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.982776 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-ovsdbserver-sb\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.982816 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-dns-svc\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.983192 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-openstack-edpm-ipam\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.983204 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-ovsdbserver-nb\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:42 crc kubenswrapper[4749]: I1001 13:29:42.983552 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-config\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.003978 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kjg6\" (UniqueName: \"kubernetes.io/projected/1c268c09-ae8f-49b8-916f-b5ce032bfaf1-kube-api-access-8kjg6\") pod \"dnsmasq-dns-6ff49d554c-jx4l4\" (UID: \"1c268c09-ae8f-49b8-916f-b5ce032bfaf1\") " pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.154475 4749 generic.go:334] "Generic (PLEG): container finished" podID="456a2974-f990-4ace-a841-a20ea0787247" containerID="bfcc875600a9130419aaec6147b5db7d06dad82c8071201adf605ca0b1109fec" exitCode=0
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.154512 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" event={"ID":"456a2974-f990-4ace-a841-a20ea0787247","Type":"ContainerDied","Data":"bfcc875600a9130419aaec6147b5db7d06dad82c8071201adf605ca0b1109fec"}
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.176461 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4"
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.242999 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53163753-4229-4d98-b47d-f469b87dc8b1" path="/var/lib/kubelet/pods/53163753-4229-4d98-b47d-f469b87dc8b1/volumes"
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.343089 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5"
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.389346 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-sb\") pod \"456a2974-f990-4ace-a841-a20ea0787247\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") "
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.389483 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-swift-storage-0\") pod \"456a2974-f990-4ace-a841-a20ea0787247\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") "
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.389557 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-config\") pod \"456a2974-f990-4ace-a841-a20ea0787247\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") "
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.389604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7xr4\" (UniqueName: \"kubernetes.io/projected/456a2974-f990-4ace-a841-a20ea0787247-kube-api-access-f7xr4\") pod \"456a2974-f990-4ace-a841-a20ea0787247\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") "
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.389658 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-nb\") pod \"456a2974-f990-4ace-a841-a20ea0787247\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") "
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.389720 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-svc\") pod \"456a2974-f990-4ace-a841-a20ea0787247\" (UID: \"456a2974-f990-4ace-a841-a20ea0787247\") "
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.395238 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/456a2974-f990-4ace-a841-a20ea0787247-kube-api-access-f7xr4" (OuterVolumeSpecName: "kube-api-access-f7xr4") pod "456a2974-f990-4ace-a841-a20ea0787247" (UID: "456a2974-f990-4ace-a841-a20ea0787247"). InnerVolumeSpecName "kube-api-access-f7xr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.494094 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "456a2974-f990-4ace-a841-a20ea0787247" (UID: "456a2974-f990-4ace-a841-a20ea0787247"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.494325 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.494349 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7xr4\" (UniqueName: \"kubernetes.io/projected/456a2974-f990-4ace-a841-a20ea0787247-kube-api-access-f7xr4\") on node \"crc\" DevicePath \"\""
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.499389 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "456a2974-f990-4ace-a841-a20ea0787247" (UID: "456a2974-f990-4ace-a841-a20ea0787247"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.512466 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "456a2974-f990-4ace-a841-a20ea0787247" (UID: "456a2974-f990-4ace-a841-a20ea0787247"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.512493 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-config" (OuterVolumeSpecName: "config") pod "456a2974-f990-4ace-a841-a20ea0787247" (UID: "456a2974-f990-4ace-a841-a20ea0787247"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.536756 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "456a2974-f990-4ace-a841-a20ea0787247" (UID: "456a2974-f990-4ace-a841-a20ea0787247"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.595929 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.595969 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.595979 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.595989 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/456a2974-f990-4ace-a841-a20ea0787247-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:43 crc kubenswrapper[4749]: I1001 13:29:43.668861 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff49d554c-jx4l4"] Oct 01 13:29:44 crc kubenswrapper[4749]: I1001 13:29:44.164480 4749 generic.go:334] "Generic (PLEG): container finished" podID="1c268c09-ae8f-49b8-916f-b5ce032bfaf1" containerID="9439ea0a98fa165e68090bca9f0c1660ba441410ad3dc8eeec9f4855e9cc6505" exitCode=0 Oct 01 13:29:44 crc 
kubenswrapper[4749]: I1001 13:29:44.164781 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4" event={"ID":"1c268c09-ae8f-49b8-916f-b5ce032bfaf1","Type":"ContainerDied","Data":"9439ea0a98fa165e68090bca9f0c1660ba441410ad3dc8eeec9f4855e9cc6505"} Oct 01 13:29:44 crc kubenswrapper[4749]: I1001 13:29:44.164817 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4" event={"ID":"1c268c09-ae8f-49b8-916f-b5ce032bfaf1","Type":"ContainerStarted","Data":"7ea6c26d6448d6726152a5a91c4cae85b24213dfdd3d7374ffb820dbc634788f"} Oct 01 13:29:44 crc kubenswrapper[4749]: I1001 13:29:44.167510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" event={"ID":"456a2974-f990-4ace-a841-a20ea0787247","Type":"ContainerDied","Data":"89fe1422939946d91d7cd4827d137d4ebb1fd719a7e0026a892eb116aa48574b"} Oct 01 13:29:44 crc kubenswrapper[4749]: I1001 13:29:44.167571 4749 scope.go:117] "RemoveContainer" containerID="bfcc875600a9130419aaec6147b5db7d06dad82c8071201adf605ca0b1109fec" Oct 01 13:29:44 crc kubenswrapper[4749]: I1001 13:29:44.167603 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79c7b4b87f-d7lt5" Oct 01 13:29:44 crc kubenswrapper[4749]: I1001 13:29:44.220991 4749 scope.go:117] "RemoveContainer" containerID="88e32108ddfafe1d0f2afd2a67bd001ecde8c5b911bbab178a7a0e22361a1d20" Oct 01 13:29:44 crc kubenswrapper[4749]: I1001 13:29:44.226012 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79c7b4b87f-d7lt5"] Oct 01 13:29:44 crc kubenswrapper[4749]: I1001 13:29:44.234951 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79c7b4b87f-d7lt5"] Oct 01 13:29:45 crc kubenswrapper[4749]: I1001 13:29:45.187067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4" event={"ID":"1c268c09-ae8f-49b8-916f-b5ce032bfaf1","Type":"ContainerStarted","Data":"897dd493874228202879fabc212eb9ffd53ebd775e067c28884dbdbb47f216bf"} Oct 01 13:29:45 crc kubenswrapper[4749]: I1001 13:29:45.187562 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4" Oct 01 13:29:45 crc kubenswrapper[4749]: I1001 13:29:45.229256 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4" podStartSLOduration=3.229231256 podStartE2EDuration="3.229231256s" podCreationTimestamp="2025-10-01 13:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:29:45.206074757 +0000 UTC m=+1445.260059666" watchObservedRunningTime="2025-10-01 13:29:45.229231256 +0000 UTC m=+1445.283216165" Oct 01 13:29:45 crc kubenswrapper[4749]: I1001 13:29:45.253729 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="456a2974-f990-4ace-a841-a20ea0787247" path="/var/lib/kubelet/pods/456a2974-f990-4ace-a841-a20ea0787247/volumes" Oct 01 13:29:47 crc kubenswrapper[4749]: E1001 13:29:47.877202 4749 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53163753_4229_4d98_b47d_f469b87dc8b1.slice/crio-conmon-2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.179505 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6ff49d554c-jx4l4" Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.310183 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5864d459-zjnc6"] Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.314722 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" podUID="2e200dd4-74df-427c-851a-8b2e993f186b" containerName="dnsmasq-dns" containerID="cri-o://3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e" gracePeriod=10 Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.873593 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.934838 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-nb\") pod \"2e200dd4-74df-427c-851a-8b2e993f186b\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.935082 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-svc\") pod \"2e200dd4-74df-427c-851a-8b2e993f186b\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.935185 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-openstack-edpm-ipam\") pod \"2e200dd4-74df-427c-851a-8b2e993f186b\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.935255 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpngw\" (UniqueName: \"kubernetes.io/projected/2e200dd4-74df-427c-851a-8b2e993f186b-kube-api-access-zpngw\") pod \"2e200dd4-74df-427c-851a-8b2e993f186b\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.935380 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-config\") pod \"2e200dd4-74df-427c-851a-8b2e993f186b\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.935433 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-swift-storage-0\") pod \"2e200dd4-74df-427c-851a-8b2e993f186b\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.935493 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-sb\") pod \"2e200dd4-74df-427c-851a-8b2e993f186b\" (UID: \"2e200dd4-74df-427c-851a-8b2e993f186b\") " Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.941029 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e200dd4-74df-427c-851a-8b2e993f186b-kube-api-access-zpngw" (OuterVolumeSpecName: "kube-api-access-zpngw") pod "2e200dd4-74df-427c-851a-8b2e993f186b" (UID: "2e200dd4-74df-427c-851a-8b2e993f186b"). InnerVolumeSpecName "kube-api-access-zpngw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:29:53 crc kubenswrapper[4749]: I1001 13:29:53.994438 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "2e200dd4-74df-427c-851a-8b2e993f186b" (UID: "2e200dd4-74df-427c-851a-8b2e993f186b"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.003793 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-config" (OuterVolumeSpecName: "config") pod "2e200dd4-74df-427c-851a-8b2e993f186b" (UID: "2e200dd4-74df-427c-851a-8b2e993f186b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.006368 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e200dd4-74df-427c-851a-8b2e993f186b" (UID: "2e200dd4-74df-427c-851a-8b2e993f186b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.007133 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e200dd4-74df-427c-851a-8b2e993f186b" (UID: "2e200dd4-74df-427c-851a-8b2e993f186b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.017092 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2e200dd4-74df-427c-851a-8b2e993f186b" (UID: "2e200dd4-74df-427c-851a-8b2e993f186b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.021355 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e200dd4-74df-427c-851a-8b2e993f186b" (UID: "2e200dd4-74df-427c-851a-8b2e993f186b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.038170 4749 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.038205 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.038234 4749 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.038248 4749 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.038260 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.038272 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpngw\" (UniqueName: \"kubernetes.io/projected/2e200dd4-74df-427c-851a-8b2e993f186b-kube-api-access-zpngw\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.038284 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e200dd4-74df-427c-851a-8b2e993f186b-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.319098 
4749 generic.go:334] "Generic (PLEG): container finished" podID="2e200dd4-74df-427c-851a-8b2e993f186b" containerID="3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e" exitCode=0 Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.319185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" event={"ID":"2e200dd4-74df-427c-851a-8b2e993f186b","Type":"ContainerDied","Data":"3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e"} Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.319331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" event={"ID":"2e200dd4-74df-427c-851a-8b2e993f186b","Type":"ContainerDied","Data":"54015c60763edea90f938c37e0cd2a0273d4dbfa88e1e2678653fb0af4bc6cae"} Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.319569 4749 scope.go:117] "RemoveContainer" containerID="3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.319850 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c5864d459-zjnc6" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.348179 4749 scope.go:117] "RemoveContainer" containerID="dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.373590 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5864d459-zjnc6"] Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.377791 4749 scope.go:117] "RemoveContainer" containerID="3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e" Oct 01 13:29:54 crc kubenswrapper[4749]: E1001 13:29:54.378254 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e\": container with ID starting with 3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e not found: ID does not exist" containerID="3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.378369 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e"} err="failed to get container status \"3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e\": rpc error: code = NotFound desc = could not find container \"3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e\": container with ID starting with 3dbeb65ac4b7a42c99110425e853ae59809485dd62456d4dc14e6b23c5d90d3e not found: ID does not exist" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.378466 4749 scope.go:117] "RemoveContainer" containerID="dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7" Oct 01 13:29:54 crc kubenswrapper[4749]: E1001 13:29:54.380524 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7\": container with ID starting with dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7 not found: ID does not exist" containerID="dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.380594 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7"} err="failed to get container status \"dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7\": rpc error: code = NotFound desc = could not find container \"dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7\": container with ID starting with dfabb68ecf10264557e7242ad392084e479245c3a4be8ed94328bddaedff74f7 not found: ID does not exist" Oct 01 13:29:54 crc kubenswrapper[4749]: I1001 13:29:54.383139 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5864d459-zjnc6"] Oct 01 13:29:55 crc kubenswrapper[4749]: I1001 13:29:55.252403 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e200dd4-74df-427c-851a-8b2e993f186b" path="/var/lib/kubelet/pods/2e200dd4-74df-427c-851a-8b2e993f186b/volumes" Oct 01 13:29:57 crc kubenswrapper[4749]: I1001 13:29:57.367485 4749 generic.go:334] "Generic (PLEG): container finished" podID="5b5aa915-bf5a-4046-834c-6051ed420f42" containerID="5368ade1a927c23b68b5dd32066a35b83f41e3408e454c1e9b751f52ae1f00e3" exitCode=0 Oct 01 13:29:57 crc kubenswrapper[4749]: I1001 13:29:57.367587 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b5aa915-bf5a-4046-834c-6051ed420f42","Type":"ContainerDied","Data":"5368ade1a927c23b68b5dd32066a35b83f41e3408e454c1e9b751f52ae1f00e3"} Oct 01 13:29:58 crc kubenswrapper[4749]: E1001 13:29:58.204461 4749 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53163753_4229_4d98_b47d_f469b87dc8b1.slice/crio-conmon-2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:29:58 crc kubenswrapper[4749]: I1001 13:29:58.379531 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5b5aa915-bf5a-4046-834c-6051ed420f42","Type":"ContainerStarted","Data":"5ccbd3b72fb2e9270cac4255d473b7c350f722839dc8313b4af2bf117443d648"} Oct 01 13:29:58 crc kubenswrapper[4749]: I1001 13:29:58.380886 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 13:29:58 crc kubenswrapper[4749]: I1001 13:29:58.383995 4749 generic.go:334] "Generic (PLEG): container finished" podID="811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7" containerID="e6543fb74f3e73522f6b4c9c4a80f31d90da328b9449c79d7baef9050e3d21f1" exitCode=0 Oct 01 13:29:58 crc kubenswrapper[4749]: I1001 13:29:58.384100 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7","Type":"ContainerDied","Data":"e6543fb74f3e73522f6b4c9c4a80f31d90da328b9449c79d7baef9050e3d21f1"} Oct 01 13:29:58 crc kubenswrapper[4749]: I1001 13:29:58.410675 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.410649541 podStartE2EDuration="37.410649541s" podCreationTimestamp="2025-10-01 13:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:29:58.403025727 +0000 UTC m=+1458.457010646" watchObservedRunningTime="2025-10-01 13:29:58.410649541 +0000 UTC m=+1458.464634460" Oct 01 13:29:59 crc kubenswrapper[4749]: I1001 13:29:59.395027 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7","Type":"ContainerStarted","Data":"e25744de051cf38f863f84fff390bfec107d3b32373d7db6e4b6712443320aa1"} Oct 01 13:29:59 crc kubenswrapper[4749]: I1001 13:29:59.395555 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.147867 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.147844476 podStartE2EDuration="38.147844476s" podCreationTimestamp="2025-10-01 13:29:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:29:59.419648122 +0000 UTC m=+1459.473633031" watchObservedRunningTime="2025-10-01 13:30:00.147844476 +0000 UTC m=+1460.201829385" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.161424 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn"] Oct 01 13:30:00 crc kubenswrapper[4749]: E1001 13:30:00.162048 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="456a2974-f990-4ace-a841-a20ea0787247" containerName="dnsmasq-dns" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.162075 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="456a2974-f990-4ace-a841-a20ea0787247" containerName="dnsmasq-dns" Oct 01 13:30:00 crc kubenswrapper[4749]: E1001 13:30:00.162117 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e200dd4-74df-427c-851a-8b2e993f186b" containerName="init" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.162135 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e200dd4-74df-427c-851a-8b2e993f186b" containerName="init" Oct 01 13:30:00 crc kubenswrapper[4749]: E1001 13:30:00.162264 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="456a2974-f990-4ace-a841-a20ea0787247" containerName="init" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.162285 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="456a2974-f990-4ace-a841-a20ea0787247" containerName="init" Oct 01 13:30:00 crc kubenswrapper[4749]: E1001 13:30:00.162318 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e200dd4-74df-427c-851a-8b2e993f186b" containerName="dnsmasq-dns" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.162331 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e200dd4-74df-427c-851a-8b2e993f186b" containerName="dnsmasq-dns" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.162662 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e200dd4-74df-427c-851a-8b2e993f186b" containerName="dnsmasq-dns" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.162696 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="456a2974-f990-4ace-a841-a20ea0787247" containerName="dnsmasq-dns" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.163744 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.165902 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.166314 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.180982 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn"] Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.188369 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f6bfb71-014e-4eca-8a3f-5e93745f039f-config-volume\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.188454 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f6bfb71-014e-4eca-8a3f-5e93745f039f-secret-volume\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.188569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hchr\" (UniqueName: \"kubernetes.io/projected/5f6bfb71-014e-4eca-8a3f-5e93745f039f-kube-api-access-6hchr\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.291191 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f6bfb71-014e-4eca-8a3f-5e93745f039f-config-volume\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.291281 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f6bfb71-014e-4eca-8a3f-5e93745f039f-secret-volume\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.291346 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hchr\" (UniqueName: \"kubernetes.io/projected/5f6bfb71-014e-4eca-8a3f-5e93745f039f-kube-api-access-6hchr\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.293203 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f6bfb71-014e-4eca-8a3f-5e93745f039f-config-volume\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.306959 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5f6bfb71-014e-4eca-8a3f-5e93745f039f-secret-volume\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.314135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hchr\" (UniqueName: \"kubernetes.io/projected/5f6bfb71-014e-4eca-8a3f-5e93745f039f-kube-api-access-6hchr\") pod \"collect-profiles-29322090-tbrsn\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.509607 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:00 crc kubenswrapper[4749]: I1001 13:30:00.981835 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn"] Oct 01 13:30:01 crc kubenswrapper[4749]: I1001 13:30:01.416805 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" event={"ID":"5f6bfb71-014e-4eca-8a3f-5e93745f039f","Type":"ContainerStarted","Data":"23ea4160487656a8788c18b91e0711534428f68278dd4495e03ad94b070dc433"} Oct 01 13:30:01 crc kubenswrapper[4749]: I1001 13:30:01.416845 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" event={"ID":"5f6bfb71-014e-4eca-8a3f-5e93745f039f","Type":"ContainerStarted","Data":"642290a9b9b857a6811330f6ed1d0b245b86b36802aa8dfe2b7d7f73660d4a1c"} Oct 01 13:30:01 crc kubenswrapper[4749]: I1001 13:30:01.434116 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" 
podStartSLOduration=1.433902054 podStartE2EDuration="1.433902054s" podCreationTimestamp="2025-10-01 13:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:30:01.432561525 +0000 UTC m=+1461.486546424" watchObservedRunningTime="2025-10-01 13:30:01.433902054 +0000 UTC m=+1461.487886953" Oct 01 13:30:02 crc kubenswrapper[4749]: I1001 13:30:02.110761 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:30:02 crc kubenswrapper[4749]: I1001 13:30:02.111154 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:30:02 crc kubenswrapper[4749]: I1001 13:30:02.427123 4749 generic.go:334] "Generic (PLEG): container finished" podID="5f6bfb71-014e-4eca-8a3f-5e93745f039f" containerID="23ea4160487656a8788c18b91e0711534428f68278dd4495e03ad94b070dc433" exitCode=0 Oct 01 13:30:02 crc kubenswrapper[4749]: I1001 13:30:02.427167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" event={"ID":"5f6bfb71-014e-4eca-8a3f-5e93745f039f","Type":"ContainerDied","Data":"23ea4160487656a8788c18b91e0711534428f68278dd4495e03ad94b070dc433"} Oct 01 13:30:03 crc kubenswrapper[4749]: I1001 13:30:03.904837 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.068347 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hchr\" (UniqueName: \"kubernetes.io/projected/5f6bfb71-014e-4eca-8a3f-5e93745f039f-kube-api-access-6hchr\") pod \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.068433 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f6bfb71-014e-4eca-8a3f-5e93745f039f-secret-volume\") pod \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.068600 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f6bfb71-014e-4eca-8a3f-5e93745f039f-config-volume\") pod \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\" (UID: \"5f6bfb71-014e-4eca-8a3f-5e93745f039f\") " Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.069396 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f6bfb71-014e-4eca-8a3f-5e93745f039f-config-volume" (OuterVolumeSpecName: "config-volume") pod "5f6bfb71-014e-4eca-8a3f-5e93745f039f" (UID: "5f6bfb71-014e-4eca-8a3f-5e93745f039f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.076583 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f6bfb71-014e-4eca-8a3f-5e93745f039f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5f6bfb71-014e-4eca-8a3f-5e93745f039f" (UID: "5f6bfb71-014e-4eca-8a3f-5e93745f039f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.078224 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f6bfb71-014e-4eca-8a3f-5e93745f039f-kube-api-access-6hchr" (OuterVolumeSpecName: "kube-api-access-6hchr") pod "5f6bfb71-014e-4eca-8a3f-5e93745f039f" (UID: "5f6bfb71-014e-4eca-8a3f-5e93745f039f"). InnerVolumeSpecName "kube-api-access-6hchr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.170741 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f6bfb71-014e-4eca-8a3f-5e93745f039f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.170778 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hchr\" (UniqueName: \"kubernetes.io/projected/5f6bfb71-014e-4eca-8a3f-5e93745f039f-kube-api-access-6hchr\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.170789 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f6bfb71-014e-4eca-8a3f-5e93745f039f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.455063 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" event={"ID":"5f6bfb71-014e-4eca-8a3f-5e93745f039f","Type":"ContainerDied","Data":"642290a9b9b857a6811330f6ed1d0b245b86b36802aa8dfe2b7d7f73660d4a1c"} Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.455114 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="642290a9b9b857a6811330f6ed1d0b245b86b36802aa8dfe2b7d7f73660d4a1c" Oct 01 13:30:04 crc kubenswrapper[4749]: I1001 13:30:04.455201 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn" Oct 01 13:30:08 crc kubenswrapper[4749]: E1001 13:30:08.501169 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53163753_4229_4d98_b47d_f469b87dc8b1.slice/crio-conmon-2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.535881 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n"] Oct 01 13:30:11 crc kubenswrapper[4749]: E1001 13:30:11.536807 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f6bfb71-014e-4eca-8a3f-5e93745f039f" containerName="collect-profiles" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.536826 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f6bfb71-014e-4eca-8a3f-5e93745f039f" containerName="collect-profiles" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.537151 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f6bfb71-014e-4eca-8a3f-5e93745f039f" containerName="collect-profiles" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.537939 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.540317 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.540834 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.541030 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.541198 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.550416 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n"] Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.711578 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pxvs\" (UniqueName: \"kubernetes.io/projected/ef750054-fd5c-408e-bd33-90e1a43d8a86-kube-api-access-6pxvs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.711664 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.711777 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.711817 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.813596 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.813656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.813715 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pxvs\" (UniqueName: 
\"kubernetes.io/projected/ef750054-fd5c-408e-bd33-90e1a43d8a86-kube-api-access-6pxvs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.813767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.819696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.821264 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.821751 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc 
kubenswrapper[4749]: I1001 13:30:11.842957 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pxvs\" (UniqueName: \"kubernetes.io/projected/ef750054-fd5c-408e-bd33-90e1a43d8a86-kube-api-access-6pxvs\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-n659n\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:11 crc kubenswrapper[4749]: I1001 13:30:11.901078 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:12 crc kubenswrapper[4749]: I1001 13:30:12.457402 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 13:30:12 crc kubenswrapper[4749]: I1001 13:30:12.540153 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n"] Oct 01 13:30:13 crc kubenswrapper[4749]: I1001 13:30:13.306378 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 13:30:13 crc kubenswrapper[4749]: I1001 13:30:13.551760 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" event={"ID":"ef750054-fd5c-408e-bd33-90e1a43d8a86","Type":"ContainerStarted","Data":"864a41e806d8d327829feb088719e154b69c3f4520b4995497efaf2262b4c8c9"} Oct 01 13:30:18 crc kubenswrapper[4749]: E1001 13:30:18.775125 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53163753_4229_4d98_b47d_f469b87dc8b1.slice/crio-conmon-2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:30:23 crc kubenswrapper[4749]: I1001 13:30:23.667993 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" event={"ID":"ef750054-fd5c-408e-bd33-90e1a43d8a86","Type":"ContainerStarted","Data":"b342a1c633c616f07d534ad715e5303c19bea03ee202c7b9cf376e7629e94880"} Oct 01 13:30:23 crc kubenswrapper[4749]: I1001 13:30:23.698489 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" podStartSLOduration=2.818801813 podStartE2EDuration="12.698458093s" podCreationTimestamp="2025-10-01 13:30:11 +0000 UTC" firstStartedPulling="2025-10-01 13:30:12.546117705 +0000 UTC m=+1472.600102624" lastFinishedPulling="2025-10-01 13:30:22.425773965 +0000 UTC m=+1482.479758904" observedRunningTime="2025-10-01 13:30:23.687863822 +0000 UTC m=+1483.741848761" watchObservedRunningTime="2025-10-01 13:30:23.698458093 +0000 UTC m=+1483.752443032" Oct 01 13:30:29 crc kubenswrapper[4749]: E1001 13:30:29.047983 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53163753_4229_4d98_b47d_f469b87dc8b1.slice/crio-conmon-2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.106368 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.106975 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.373779 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k4gtq"] Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.375822 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.408498 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4gtq"] Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.479783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-catalog-content\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.479891 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-utilities\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.479950 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6g7m\" (UniqueName: \"kubernetes.io/projected/99e09f46-5b73-4300-8633-2af8a358e21f-kube-api-access-h6g7m\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.582438 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-catalog-content\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.582930 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-utilities\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.583341 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6g7m\" (UniqueName: \"kubernetes.io/projected/99e09f46-5b73-4300-8633-2af8a358e21f-kube-api-access-h6g7m\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.583536 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-utilities\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.583446 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-catalog-content\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.604549 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h6g7m\" (UniqueName: \"kubernetes.io/projected/99e09f46-5b73-4300-8633-2af8a358e21f-kube-api-access-h6g7m\") pod \"certified-operators-k4gtq\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:32 crc kubenswrapper[4749]: I1001 13:30:32.722377 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:33 crc kubenswrapper[4749]: I1001 13:30:33.198805 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4gtq"] Oct 01 13:30:33 crc kubenswrapper[4749]: W1001 13:30:33.206340 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e09f46_5b73_4300_8633_2af8a358e21f.slice/crio-e1e990153cfca1ab56e3c3b5c9a89ded30a219dd27dfe630c4b197615e3c1fc3 WatchSource:0}: Error finding container e1e990153cfca1ab56e3c3b5c9a89ded30a219dd27dfe630c4b197615e3c1fc3: Status 404 returned error can't find the container with id e1e990153cfca1ab56e3c3b5c9a89ded30a219dd27dfe630c4b197615e3c1fc3 Oct 01 13:30:33 crc kubenswrapper[4749]: I1001 13:30:33.792112 4749 generic.go:334] "Generic (PLEG): container finished" podID="99e09f46-5b73-4300-8633-2af8a358e21f" containerID="52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57" exitCode=0 Oct 01 13:30:33 crc kubenswrapper[4749]: I1001 13:30:33.792167 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4gtq" event={"ID":"99e09f46-5b73-4300-8633-2af8a358e21f","Type":"ContainerDied","Data":"52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57"} Oct 01 13:30:33 crc kubenswrapper[4749]: I1001 13:30:33.792522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4gtq" 
event={"ID":"99e09f46-5b73-4300-8633-2af8a358e21f","Type":"ContainerStarted","Data":"e1e990153cfca1ab56e3c3b5c9a89ded30a219dd27dfe630c4b197615e3c1fc3"} Oct 01 13:30:34 crc kubenswrapper[4749]: I1001 13:30:34.825288 4749 generic.go:334] "Generic (PLEG): container finished" podID="ef750054-fd5c-408e-bd33-90e1a43d8a86" containerID="b342a1c633c616f07d534ad715e5303c19bea03ee202c7b9cf376e7629e94880" exitCode=0 Oct 01 13:30:34 crc kubenswrapper[4749]: I1001 13:30:34.825518 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" event={"ID":"ef750054-fd5c-408e-bd33-90e1a43d8a86","Type":"ContainerDied","Data":"b342a1c633c616f07d534ad715e5303c19bea03ee202c7b9cf376e7629e94880"} Oct 01 13:30:35 crc kubenswrapper[4749]: I1001 13:30:35.848656 4749 generic.go:334] "Generic (PLEG): container finished" podID="99e09f46-5b73-4300-8633-2af8a358e21f" containerID="de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3" exitCode=0 Oct 01 13:30:35 crc kubenswrapper[4749]: I1001 13:30:35.848902 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4gtq" event={"ID":"99e09f46-5b73-4300-8633-2af8a358e21f","Type":"ContainerDied","Data":"de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3"} Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.418864 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.568725 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-repo-setup-combined-ca-bundle\") pod \"ef750054-fd5c-408e-bd33-90e1a43d8a86\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.569298 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pxvs\" (UniqueName: \"kubernetes.io/projected/ef750054-fd5c-408e-bd33-90e1a43d8a86-kube-api-access-6pxvs\") pod \"ef750054-fd5c-408e-bd33-90e1a43d8a86\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.569355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-ssh-key\") pod \"ef750054-fd5c-408e-bd33-90e1a43d8a86\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.569418 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-inventory\") pod \"ef750054-fd5c-408e-bd33-90e1a43d8a86\" (UID: \"ef750054-fd5c-408e-bd33-90e1a43d8a86\") " Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.574637 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef750054-fd5c-408e-bd33-90e1a43d8a86-kube-api-access-6pxvs" (OuterVolumeSpecName: "kube-api-access-6pxvs") pod "ef750054-fd5c-408e-bd33-90e1a43d8a86" (UID: "ef750054-fd5c-408e-bd33-90e1a43d8a86"). InnerVolumeSpecName "kube-api-access-6pxvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.576958 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ef750054-fd5c-408e-bd33-90e1a43d8a86" (UID: "ef750054-fd5c-408e-bd33-90e1a43d8a86"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.602876 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-inventory" (OuterVolumeSpecName: "inventory") pod "ef750054-fd5c-408e-bd33-90e1a43d8a86" (UID: "ef750054-fd5c-408e-bd33-90e1a43d8a86"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.613121 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef750054-fd5c-408e-bd33-90e1a43d8a86" (UID: "ef750054-fd5c-408e-bd33-90e1a43d8a86"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.672052 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pxvs\" (UniqueName: \"kubernetes.io/projected/ef750054-fd5c-408e-bd33-90e1a43d8a86-kube-api-access-6pxvs\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.672102 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.672111 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.672121 4749 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef750054-fd5c-408e-bd33-90e1a43d8a86-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.864971 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4gtq" event={"ID":"99e09f46-5b73-4300-8633-2af8a358e21f","Type":"ContainerStarted","Data":"992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486"} Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.869677 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" event={"ID":"ef750054-fd5c-408e-bd33-90e1a43d8a86","Type":"ContainerDied","Data":"864a41e806d8d327829feb088719e154b69c3f4520b4995497efaf2262b4c8c9"} Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.869719 4749 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="864a41e806d8d327829feb088719e154b69c3f4520b4995497efaf2262b4c8c9" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.869786 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-n659n" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.904967 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k4gtq" podStartSLOduration=2.432679844 podStartE2EDuration="4.904941533s" podCreationTimestamp="2025-10-01 13:30:32 +0000 UTC" firstStartedPulling="2025-10-01 13:30:33.794053238 +0000 UTC m=+1493.848038147" lastFinishedPulling="2025-10-01 13:30:36.266314937 +0000 UTC m=+1496.320299836" observedRunningTime="2025-10-01 13:30:36.893339172 +0000 UTC m=+1496.947324111" watchObservedRunningTime="2025-10-01 13:30:36.904941533 +0000 UTC m=+1496.958926452" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.945955 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx"] Oct 01 13:30:36 crc kubenswrapper[4749]: E1001 13:30:36.946458 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef750054-fd5c-408e-bd33-90e1a43d8a86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.946482 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef750054-fd5c-408e-bd33-90e1a43d8a86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.946737 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef750054-fd5c-408e-bd33-90e1a43d8a86" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.947464 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.949697 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.950057 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.950066 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.950293 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:30:36 crc kubenswrapper[4749]: I1001 13:30:36.968860 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx"] Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.079240 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.079594 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jhr7\" (UniqueName: \"kubernetes.io/projected/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-kube-api-access-5jhr7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.079880 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.181497 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.181563 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jhr7\" (UniqueName: \"kubernetes.io/projected/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-kube-api-access-5jhr7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.181655 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.186904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.187919 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.206177 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jhr7\" (UniqueName: \"kubernetes.io/projected/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-kube-api-access-5jhr7\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zl4wx\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.275804 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.859636 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx"] Oct 01 13:30:37 crc kubenswrapper[4749]: W1001 13:30:37.864978 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae065bff_2fba_4e8b_a734_75cd8b9d1a26.slice/crio-50549079e7267c99b98ffa7d9c8c9328d1f5c0631e8344a8881cf04ed36f8c4c WatchSource:0}: Error finding container 50549079e7267c99b98ffa7d9c8c9328d1f5c0631e8344a8881cf04ed36f8c4c: Status 404 returned error can't find the container with id 50549079e7267c99b98ffa7d9c8c9328d1f5c0631e8344a8881cf04ed36f8c4c Oct 01 13:30:37 crc kubenswrapper[4749]: I1001 13:30:37.881481 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" event={"ID":"ae065bff-2fba-4e8b-a734-75cd8b9d1a26","Type":"ContainerStarted","Data":"50549079e7267c99b98ffa7d9c8c9328d1f5c0631e8344a8881cf04ed36f8c4c"} Oct 01 13:30:38 crc kubenswrapper[4749]: I1001 13:30:38.912842 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" event={"ID":"ae065bff-2fba-4e8b-a734-75cd8b9d1a26","Type":"ContainerStarted","Data":"a5cd52c0efe2373a50f0a4554e409ca37b61ed1d712765e02640dba17eb86e48"} Oct 01 13:30:38 crc kubenswrapper[4749]: I1001 13:30:38.953121 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" podStartSLOduration=2.329399182 podStartE2EDuration="2.95308562s" podCreationTimestamp="2025-10-01 13:30:36 +0000 UTC" firstStartedPulling="2025-10-01 13:30:37.869053137 +0000 UTC m=+1497.923038036" lastFinishedPulling="2025-10-01 13:30:38.492739535 +0000 UTC m=+1498.546724474" observedRunningTime="2025-10-01 
13:30:38.938751699 +0000 UTC m=+1498.992736638" watchObservedRunningTime="2025-10-01 13:30:38.95308562 +0000 UTC m=+1499.007070569" Oct 01 13:30:39 crc kubenswrapper[4749]: E1001 13:30:39.300969 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53163753_4229_4d98_b47d_f469b87dc8b1.slice/crio-conmon-2bc43a8058ac29cbde68c2a961474928392f742ee209232ecc327196355046ec.scope\": RecentStats: unable to find data in memory cache]" Oct 01 13:30:41 crc kubenswrapper[4749]: I1001 13:30:41.953338 4749 generic.go:334] "Generic (PLEG): container finished" podID="ae065bff-2fba-4e8b-a734-75cd8b9d1a26" containerID="a5cd52c0efe2373a50f0a4554e409ca37b61ed1d712765e02640dba17eb86e48" exitCode=0 Oct 01 13:30:41 crc kubenswrapper[4749]: I1001 13:30:41.953447 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" event={"ID":"ae065bff-2fba-4e8b-a734-75cd8b9d1a26","Type":"ContainerDied","Data":"a5cd52c0efe2373a50f0a4554e409ca37b61ed1d712765e02640dba17eb86e48"} Oct 01 13:30:42 crc kubenswrapper[4749]: I1001 13:30:42.723359 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:42 crc kubenswrapper[4749]: I1001 13:30:42.723451 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:42 crc kubenswrapper[4749]: I1001 13:30:42.797368 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.039054 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.131946 4749 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4gtq"] Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.495890 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.641895 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jhr7\" (UniqueName: \"kubernetes.io/projected/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-kube-api-access-5jhr7\") pod \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.642292 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-ssh-key\") pod \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.642549 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-inventory\") pod \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\" (UID: \"ae065bff-2fba-4e8b-a734-75cd8b9d1a26\") " Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.648302 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-kube-api-access-5jhr7" (OuterVolumeSpecName: "kube-api-access-5jhr7") pod "ae065bff-2fba-4e8b-a734-75cd8b9d1a26" (UID: "ae065bff-2fba-4e8b-a734-75cd8b9d1a26"). InnerVolumeSpecName "kube-api-access-5jhr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.679273 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ae065bff-2fba-4e8b-a734-75cd8b9d1a26" (UID: "ae065bff-2fba-4e8b-a734-75cd8b9d1a26"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.695126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-inventory" (OuterVolumeSpecName: "inventory") pod "ae065bff-2fba-4e8b-a734-75cd8b9d1a26" (UID: "ae065bff-2fba-4e8b-a734-75cd8b9d1a26"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.744415 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jhr7\" (UniqueName: \"kubernetes.io/projected/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-kube-api-access-5jhr7\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.744441 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.744450 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae065bff-2fba-4e8b-a734-75cd8b9d1a26-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.979443 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.979456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zl4wx" event={"ID":"ae065bff-2fba-4e8b-a734-75cd8b9d1a26","Type":"ContainerDied","Data":"50549079e7267c99b98ffa7d9c8c9328d1f5c0631e8344a8881cf04ed36f8c4c"} Oct 01 13:30:43 crc kubenswrapper[4749]: I1001 13:30:43.979516 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50549079e7267c99b98ffa7d9c8c9328d1f5c0631e8344a8881cf04ed36f8c4c" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.073998 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w"] Oct 01 13:30:44 crc kubenswrapper[4749]: E1001 13:30:44.074434 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae065bff-2fba-4e8b-a734-75cd8b9d1a26" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.074446 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae065bff-2fba-4e8b-a734-75cd8b9d1a26" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.074630 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae065bff-2fba-4e8b-a734-75cd8b9d1a26" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.075256 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.077395 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.078621 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.078865 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.085131 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.086549 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w"] Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.254353 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.254419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzg2h\" (UniqueName: \"kubernetes.io/projected/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-kube-api-access-kzg2h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.254526 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.254900 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.356998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.357083 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzg2h\" (UniqueName: \"kubernetes.io/projected/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-kube-api-access-kzg2h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.357278 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-ssh-key\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.357351 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.362544 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.362929 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.367211 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.391332 4749 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzg2h\" (UniqueName: \"kubernetes.io/projected/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-kube-api-access-kzg2h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.395611 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:30:44 crc kubenswrapper[4749]: I1001 13:30:44.993902 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k4gtq" podUID="99e09f46-5b73-4300-8633-2af8a358e21f" containerName="registry-server" containerID="cri-o://992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486" gracePeriod=2 Oct 01 13:30:45 crc kubenswrapper[4749]: W1001 13:30:45.036891 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36105d3f_3305_4cd8_9b9c_4b3d7eaec504.slice/crio-1bbae58c401fa51aa724a06c872381575cfa8d1c35a7ac9448852bf7df4f24a1 WatchSource:0}: Error finding container 1bbae58c401fa51aa724a06c872381575cfa8d1c35a7ac9448852bf7df4f24a1: Status 404 returned error can't find the container with id 1bbae58c401fa51aa724a06c872381575cfa8d1c35a7ac9448852bf7df4f24a1 Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.040036 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w"] Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.422901 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.583482 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-utilities\") pod \"99e09f46-5b73-4300-8633-2af8a358e21f\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.583983 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6g7m\" (UniqueName: \"kubernetes.io/projected/99e09f46-5b73-4300-8633-2af8a358e21f-kube-api-access-h6g7m\") pod \"99e09f46-5b73-4300-8633-2af8a358e21f\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.584148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-catalog-content\") pod \"99e09f46-5b73-4300-8633-2af8a358e21f\" (UID: \"99e09f46-5b73-4300-8633-2af8a358e21f\") " Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.585385 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-utilities" (OuterVolumeSpecName: "utilities") pod "99e09f46-5b73-4300-8633-2af8a358e21f" (UID: "99e09f46-5b73-4300-8633-2af8a358e21f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.602470 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e09f46-5b73-4300-8633-2af8a358e21f-kube-api-access-h6g7m" (OuterVolumeSpecName: "kube-api-access-h6g7m") pod "99e09f46-5b73-4300-8633-2af8a358e21f" (UID: "99e09f46-5b73-4300-8633-2af8a358e21f"). InnerVolumeSpecName "kube-api-access-h6g7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.658930 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "99e09f46-5b73-4300-8633-2af8a358e21f" (UID: "99e09f46-5b73-4300-8633-2af8a358e21f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.686761 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.686800 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/99e09f46-5b73-4300-8633-2af8a358e21f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:45 crc kubenswrapper[4749]: I1001 13:30:45.686812 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6g7m\" (UniqueName: \"kubernetes.io/projected/99e09f46-5b73-4300-8633-2af8a358e21f-kube-api-access-h6g7m\") on node \"crc\" DevicePath \"\"" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.014594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" event={"ID":"36105d3f-3305-4cd8-9b9c-4b3d7eaec504","Type":"ContainerStarted","Data":"1bbae58c401fa51aa724a06c872381575cfa8d1c35a7ac9448852bf7df4f24a1"} Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.019674 4749 generic.go:334] "Generic (PLEG): container finished" podID="99e09f46-5b73-4300-8633-2af8a358e21f" containerID="992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486" exitCode=0 Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.019743 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4gtq" event={"ID":"99e09f46-5b73-4300-8633-2af8a358e21f","Type":"ContainerDied","Data":"992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486"} Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.019786 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4gtq" event={"ID":"99e09f46-5b73-4300-8633-2af8a358e21f","Type":"ContainerDied","Data":"e1e990153cfca1ab56e3c3b5c9a89ded30a219dd27dfe630c4b197615e3c1fc3"} Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.019820 4749 scope.go:117] "RemoveContainer" containerID="992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.020085 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4gtq" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.193914 4749 scope.go:117] "RemoveContainer" containerID="de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.232078 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4gtq"] Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.239672 4749 scope.go:117] "RemoveContainer" containerID="52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.239950 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k4gtq"] Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.269416 4749 scope.go:117] "RemoveContainer" containerID="992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486" Oct 01 13:30:46 crc kubenswrapper[4749]: E1001 13:30:46.270658 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486\": container with ID starting with 992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486 not found: ID does not exist" containerID="992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.270688 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486"} err="failed to get container status \"992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486\": rpc error: code = NotFound desc = could not find container \"992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486\": container with ID starting with 992547bd559aca4484e8114bcaf6d932d933bffed4e0dca51026847d7bbca486 not found: ID does not exist" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.270708 4749 scope.go:117] "RemoveContainer" containerID="de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3" Oct 01 13:30:46 crc kubenswrapper[4749]: E1001 13:30:46.271119 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3\": container with ID starting with de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3 not found: ID does not exist" containerID="de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.271278 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3"} err="failed to get container status \"de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3\": rpc error: code = NotFound desc = could not find container \"de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3\": container with ID 
starting with de5ffe8efd0f014828ab3815a23ec62384998c40954a5f319b6b7aad40bdfaa3 not found: ID does not exist" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.271436 4749 scope.go:117] "RemoveContainer" containerID="52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57" Oct 01 13:30:46 crc kubenswrapper[4749]: E1001 13:30:46.271946 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57\": container with ID starting with 52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57 not found: ID does not exist" containerID="52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57" Oct 01 13:30:46 crc kubenswrapper[4749]: I1001 13:30:46.272151 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57"} err="failed to get container status \"52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57\": rpc error: code = NotFound desc = could not find container \"52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57\": container with ID starting with 52711501b2c32ad093f7e4361e931c81ff00aec8619aef8172739a9f93c16e57 not found: ID does not exist" Oct 01 13:30:47 crc kubenswrapper[4749]: I1001 13:30:47.035810 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" event={"ID":"36105d3f-3305-4cd8-9b9c-4b3d7eaec504","Type":"ContainerStarted","Data":"fba415e8aa50104caa044fa2288ef3eab895b462a0df3fc7b967d98e8e172ceb"} Oct 01 13:30:47 crc kubenswrapper[4749]: I1001 13:30:47.066843 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" podStartSLOduration=2.129732871 podStartE2EDuration="3.066814713s" podCreationTimestamp="2025-10-01 13:30:44 
+0000 UTC" firstStartedPulling="2025-10-01 13:30:45.039759145 +0000 UTC m=+1505.093744044" lastFinishedPulling="2025-10-01 13:30:45.976840947 +0000 UTC m=+1506.030825886" observedRunningTime="2025-10-01 13:30:47.058599692 +0000 UTC m=+1507.112584651" watchObservedRunningTime="2025-10-01 13:30:47.066814713 +0000 UTC m=+1507.120799632" Oct 01 13:30:47 crc kubenswrapper[4749]: I1001 13:30:47.245834 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e09f46-5b73-4300-8633-2af8a358e21f" path="/var/lib/kubelet/pods/99e09f46-5b73-4300-8633-2af8a358e21f/volumes" Oct 01 13:31:02 crc kubenswrapper[4749]: I1001 13:31:02.106019 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:31:02 crc kubenswrapper[4749]: I1001 13:31:02.106718 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:31:02 crc kubenswrapper[4749]: I1001 13:31:02.106777 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:31:02 crc kubenswrapper[4749]: I1001 13:31:02.107816 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:31:02 crc 
kubenswrapper[4749]: I1001 13:31:02.107927 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" gracePeriod=600 Oct 01 13:31:02 crc kubenswrapper[4749]: E1001 13:31:02.230551 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:31:03 crc kubenswrapper[4749]: I1001 13:31:03.251622 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" exitCode=0 Oct 01 13:31:03 crc kubenswrapper[4749]: I1001 13:31:03.256757 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56"} Oct 01 13:31:03 crc kubenswrapper[4749]: I1001 13:31:03.256845 4749 scope.go:117] "RemoveContainer" containerID="f591f132880451f6e2a795c1ad995a4e9513c1a6eef56b3898e6a9f77eb8baef" Oct 01 13:31:03 crc kubenswrapper[4749]: I1001 13:31:03.257691 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:31:03 crc kubenswrapper[4749]: E1001 13:31:03.258211 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:31:06 crc kubenswrapper[4749]: I1001 13:31:06.338851 4749 scope.go:117] "RemoveContainer" containerID="98997b25feac3e5bc45d4b2eec6d4f29a9a5d254396383fb8679da4d25a6a337" Oct 01 13:31:06 crc kubenswrapper[4749]: I1001 13:31:06.375476 4749 scope.go:117] "RemoveContainer" containerID="7e8f952e99a5fa6d951aa22fd3b1ae6372b3b49ec2747a5b680d1990793a70bb" Oct 01 13:31:06 crc kubenswrapper[4749]: I1001 13:31:06.416657 4749 scope.go:117] "RemoveContainer" containerID="c35baff8f2f004747b4989fb62000677180bec81bb0d98f91dfbfa4b9749847b" Oct 01 13:31:06 crc kubenswrapper[4749]: I1001 13:31:06.465206 4749 scope.go:117] "RemoveContainer" containerID="0e9422034261039d929208334748b5eeee073307340b5829b59c177ef52b029c" Oct 01 13:31:06 crc kubenswrapper[4749]: I1001 13:31:06.518180 4749 scope.go:117] "RemoveContainer" containerID="d8145983aaf7b8e562fd6c9dcf52eb6183bcc88850615e2454d908616a11b3ee" Oct 01 13:31:17 crc kubenswrapper[4749]: I1001 13:31:17.230704 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:31:17 crc kubenswrapper[4749]: E1001 13:31:17.231890 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:31:28 crc kubenswrapper[4749]: I1001 13:31:28.230499 4749 scope.go:117] "RemoveContainer" 
containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:31:28 crc kubenswrapper[4749]: E1001 13:31:28.231261 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:31:41 crc kubenswrapper[4749]: I1001 13:31:41.243399 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:31:41 crc kubenswrapper[4749]: E1001 13:31:41.244434 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:31:55 crc kubenswrapper[4749]: I1001 13:31:55.230704 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:31:55 crc kubenswrapper[4749]: E1001 13:31:55.234424 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:32:07 crc kubenswrapper[4749]: I1001 13:32:07.229763 4749 scope.go:117] 
"RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:32:07 crc kubenswrapper[4749]: E1001 13:32:07.230525 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:32:19 crc kubenswrapper[4749]: I1001 13:32:19.230974 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:32:19 crc kubenswrapper[4749]: E1001 13:32:19.232367 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.621339 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n7tdk"] Oct 01 13:32:30 crc kubenswrapper[4749]: E1001 13:32:30.622859 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e09f46-5b73-4300-8633-2af8a358e21f" containerName="registry-server" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.623031 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e09f46-5b73-4300-8633-2af8a358e21f" containerName="registry-server" Oct 01 13:32:30 crc kubenswrapper[4749]: E1001 13:32:30.623055 4749 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99e09f46-5b73-4300-8633-2af8a358e21f" containerName="extract-content" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.623067 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e09f46-5b73-4300-8633-2af8a358e21f" containerName="extract-content" Oct 01 13:32:30 crc kubenswrapper[4749]: E1001 13:32:30.623093 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e09f46-5b73-4300-8633-2af8a358e21f" containerName="extract-utilities" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.623106 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e09f46-5b73-4300-8633-2af8a358e21f" containerName="extract-utilities" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.623454 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e09f46-5b73-4300-8633-2af8a358e21f" containerName="registry-server" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.651087 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.699694 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7tdk"] Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.763526 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-utilities\") pod \"community-operators-n7tdk\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.763789 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-catalog-content\") pod \"community-operators-n7tdk\" (UID: 
\"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.763837 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw2x7\" (UniqueName: \"kubernetes.io/projected/c4a30ad4-57d2-42a2-b612-a0412064601d-kube-api-access-xw2x7\") pod \"community-operators-n7tdk\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.865521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-catalog-content\") pod \"community-operators-n7tdk\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.865588 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw2x7\" (UniqueName: \"kubernetes.io/projected/c4a30ad4-57d2-42a2-b612-a0412064601d-kube-api-access-xw2x7\") pod \"community-operators-n7tdk\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.865649 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-utilities\") pod \"community-operators-n7tdk\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.866362 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-utilities\") pod \"community-operators-n7tdk\" (UID: 
\"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.866649 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-catalog-content\") pod \"community-operators-n7tdk\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:30 crc kubenswrapper[4749]: I1001 13:32:30.892324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw2x7\" (UniqueName: \"kubernetes.io/projected/c4a30ad4-57d2-42a2-b612-a0412064601d-kube-api-access-xw2x7\") pod \"community-operators-n7tdk\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:31 crc kubenswrapper[4749]: I1001 13:32:31.023649 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:31 crc kubenswrapper[4749]: I1001 13:32:31.240268 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:32:31 crc kubenswrapper[4749]: E1001 13:32:31.241252 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:32:31 crc kubenswrapper[4749]: I1001 13:32:31.509611 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n7tdk"] Oct 01 13:32:32 crc kubenswrapper[4749]: I1001 13:32:32.382882 4749 generic.go:334] "Generic (PLEG): container finished" podID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerID="1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf" exitCode=0 Oct 01 13:32:32 crc kubenswrapper[4749]: I1001 13:32:32.382927 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7tdk" event={"ID":"c4a30ad4-57d2-42a2-b612-a0412064601d","Type":"ContainerDied","Data":"1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf"} Oct 01 13:32:32 crc kubenswrapper[4749]: I1001 13:32:32.382958 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7tdk" event={"ID":"c4a30ad4-57d2-42a2-b612-a0412064601d","Type":"ContainerStarted","Data":"09f1bde7edd6aa1ec8148d5c362eba195ba09ae05bf6e4b4f2d3d24e9c095c9a"} Oct 01 13:32:32 crc kubenswrapper[4749]: I1001 13:32:32.385927 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 
13:32:34 crc kubenswrapper[4749]: I1001 13:32:34.412440 4749 generic.go:334] "Generic (PLEG): container finished" podID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerID="6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2" exitCode=0 Oct 01 13:32:34 crc kubenswrapper[4749]: I1001 13:32:34.412589 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7tdk" event={"ID":"c4a30ad4-57d2-42a2-b612-a0412064601d","Type":"ContainerDied","Data":"6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2"} Oct 01 13:32:36 crc kubenswrapper[4749]: I1001 13:32:36.435427 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7tdk" event={"ID":"c4a30ad4-57d2-42a2-b612-a0412064601d","Type":"ContainerStarted","Data":"96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac"} Oct 01 13:32:36 crc kubenswrapper[4749]: I1001 13:32:36.458635 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n7tdk" podStartSLOduration=3.553321674 podStartE2EDuration="6.458613621s" podCreationTimestamp="2025-10-01 13:32:30 +0000 UTC" firstStartedPulling="2025-10-01 13:32:32.385588293 +0000 UTC m=+1612.439573202" lastFinishedPulling="2025-10-01 13:32:35.29088025 +0000 UTC m=+1615.344865149" observedRunningTime="2025-10-01 13:32:36.45303821 +0000 UTC m=+1616.507023109" watchObservedRunningTime="2025-10-01 13:32:36.458613621 +0000 UTC m=+1616.512598520" Oct 01 13:32:41 crc kubenswrapper[4749]: I1001 13:32:41.024445 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:41 crc kubenswrapper[4749]: I1001 13:32:41.024716 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:41 crc kubenswrapper[4749]: I1001 13:32:41.089321 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:41 crc kubenswrapper[4749]: I1001 13:32:41.574176 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:41 crc kubenswrapper[4749]: I1001 13:32:41.634717 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7tdk"] Oct 01 13:32:43 crc kubenswrapper[4749]: I1001 13:32:43.519287 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n7tdk" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerName="registry-server" containerID="cri-o://96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac" gracePeriod=2 Oct 01 13:32:43 crc kubenswrapper[4749]: I1001 13:32:43.991666 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.068853 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-catalog-content\") pod \"c4a30ad4-57d2-42a2-b612-a0412064601d\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.069234 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw2x7\" (UniqueName: \"kubernetes.io/projected/c4a30ad4-57d2-42a2-b612-a0412064601d-kube-api-access-xw2x7\") pod \"c4a30ad4-57d2-42a2-b612-a0412064601d\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.069359 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-utilities\") pod \"c4a30ad4-57d2-42a2-b612-a0412064601d\" (UID: \"c4a30ad4-57d2-42a2-b612-a0412064601d\") " Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.072276 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-utilities" (OuterVolumeSpecName: "utilities") pod "c4a30ad4-57d2-42a2-b612-a0412064601d" (UID: "c4a30ad4-57d2-42a2-b612-a0412064601d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.091660 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a30ad4-57d2-42a2-b612-a0412064601d-kube-api-access-xw2x7" (OuterVolumeSpecName: "kube-api-access-xw2x7") pod "c4a30ad4-57d2-42a2-b612-a0412064601d" (UID: "c4a30ad4-57d2-42a2-b612-a0412064601d"). InnerVolumeSpecName "kube-api-access-xw2x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.161952 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4a30ad4-57d2-42a2-b612-a0412064601d" (UID: "c4a30ad4-57d2-42a2-b612-a0412064601d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.171243 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.171425 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw2x7\" (UniqueName: \"kubernetes.io/projected/c4a30ad4-57d2-42a2-b612-a0412064601d-kube-api-access-xw2x7\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.171488 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a30ad4-57d2-42a2-b612-a0412064601d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.230607 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:32:44 crc kubenswrapper[4749]: E1001 13:32:44.230995 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.542436 4749 generic.go:334] "Generic (PLEG): container finished" podID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerID="96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac" exitCode=0 Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.542538 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n7tdk" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.542568 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7tdk" event={"ID":"c4a30ad4-57d2-42a2-b612-a0412064601d","Type":"ContainerDied","Data":"96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac"} Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.543185 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n7tdk" event={"ID":"c4a30ad4-57d2-42a2-b612-a0412064601d","Type":"ContainerDied","Data":"09f1bde7edd6aa1ec8148d5c362eba195ba09ae05bf6e4b4f2d3d24e9c095c9a"} Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.543235 4749 scope.go:117] "RemoveContainer" containerID="96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.605554 4749 scope.go:117] "RemoveContainer" containerID="6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.617255 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n7tdk"] Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.633361 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n7tdk"] Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.651738 4749 scope.go:117] "RemoveContainer" containerID="1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.726701 4749 scope.go:117] "RemoveContainer" containerID="96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac" Oct 01 13:32:44 crc kubenswrapper[4749]: E1001 13:32:44.731770 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac\": container with ID starting with 96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac not found: ID does not exist" containerID="96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.731861 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac"} err="failed to get container status \"96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac\": rpc error: code = NotFound desc = could not find container \"96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac\": container with ID starting with 96daf00692cca328bcd5acaeedee193d2e320d396cb237ffc07eca117a4fcfac not found: ID does not exist" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.731907 4749 scope.go:117] "RemoveContainer" containerID="6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2" Oct 01 13:32:44 crc kubenswrapper[4749]: E1001 13:32:44.732560 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2\": container with ID starting with 6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2 not found: ID does not exist" containerID="6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.732640 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2"} err="failed to get container status \"6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2\": rpc error: code = NotFound desc = could not find container \"6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2\": container with ID 
starting with 6cbce4e38bb556a93586195dcdd5fcfd2ffb933d877ad8f8a1074e3280652fa2 not found: ID does not exist" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.732690 4749 scope.go:117] "RemoveContainer" containerID="1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf" Oct 01 13:32:44 crc kubenswrapper[4749]: E1001 13:32:44.733387 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf\": container with ID starting with 1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf not found: ID does not exist" containerID="1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf" Oct 01 13:32:44 crc kubenswrapper[4749]: I1001 13:32:44.733441 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf"} err="failed to get container status \"1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf\": rpc error: code = NotFound desc = could not find container \"1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf\": container with ID starting with 1d9b5c3fe0b4ef8613fc66183b44fc173e546b5c4e376b081abdb6cb1185bcdf not found: ID does not exist" Oct 01 13:32:45 crc kubenswrapper[4749]: I1001 13:32:45.245521 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" path="/var/lib/kubelet/pods/c4a30ad4-57d2-42a2-b612-a0412064601d/volumes" Oct 01 13:32:57 crc kubenswrapper[4749]: I1001 13:32:57.232205 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:32:57 crc kubenswrapper[4749]: E1001 13:32:57.233087 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:33:08 crc kubenswrapper[4749]: I1001 13:33:08.230885 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:33:08 crc kubenswrapper[4749]: E1001 13:33:08.231847 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:33:19 crc kubenswrapper[4749]: I1001 13:33:19.229964 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:33:19 crc kubenswrapper[4749]: E1001 13:33:19.230626 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:33:34 crc kubenswrapper[4749]: I1001 13:33:34.229681 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:33:34 crc kubenswrapper[4749]: E1001 13:33:34.231665 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:33:47 crc kubenswrapper[4749]: I1001 13:33:47.230724 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:33:47 crc kubenswrapper[4749]: E1001 13:33:47.232272 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:33:58 crc kubenswrapper[4749]: I1001 13:33:58.230653 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:33:58 crc kubenswrapper[4749]: E1001 13:33:58.232170 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.045177 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-z2mtw"] Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.058355 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-jrmhv"] Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 
13:34:04.067664 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-g8hvl"] Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.076821 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-z2mtw"] Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.085899 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-bc6w2"] Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.094575 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-jrmhv"] Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.103377 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-g8hvl"] Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.112416 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-bc6w2"] Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.493119 4749 generic.go:334] "Generic (PLEG): container finished" podID="36105d3f-3305-4cd8-9b9c-4b3d7eaec504" containerID="fba415e8aa50104caa044fa2288ef3eab895b462a0df3fc7b967d98e8e172ceb" exitCode=0 Oct 01 13:34:04 crc kubenswrapper[4749]: I1001 13:34:04.493246 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" event={"ID":"36105d3f-3305-4cd8-9b9c-4b3d7eaec504","Type":"ContainerDied","Data":"fba415e8aa50104caa044fa2288ef3eab895b462a0df3fc7b967d98e8e172ceb"} Oct 01 13:34:05 crc kubenswrapper[4749]: I1001 13:34:05.247497 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061adc28-9cad-4d25-8245-906926bc7509" path="/var/lib/kubelet/pods/061adc28-9cad-4d25-8245-906926bc7509/volumes" Oct 01 13:34:05 crc kubenswrapper[4749]: I1001 13:34:05.248188 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45fc3ae7-532e-4a89-91e7-7bcd8ff37a19" 
path="/var/lib/kubelet/pods/45fc3ae7-532e-4a89-91e7-7bcd8ff37a19/volumes" Oct 01 13:34:05 crc kubenswrapper[4749]: I1001 13:34:05.248799 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c596b48d-fdb6-4b00-96cb-a276d563900f" path="/var/lib/kubelet/pods/c596b48d-fdb6-4b00-96cb-a276d563900f/volumes" Oct 01 13:34:05 crc kubenswrapper[4749]: I1001 13:34:05.249485 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d72dd5-fa0f-48f9-9845-4dbccc6bd156" path="/var/lib/kubelet/pods/e6d72dd5-fa0f-48f9-9845-4dbccc6bd156/volumes" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.003304 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.094781 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzg2h\" (UniqueName: \"kubernetes.io/projected/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-kube-api-access-kzg2h\") pod \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.094889 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-inventory\") pod \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.094964 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-ssh-key\") pod \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.095020 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-bootstrap-combined-ca-bundle\") pod \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\" (UID: \"36105d3f-3305-4cd8-9b9c-4b3d7eaec504\") " Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.100733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "36105d3f-3305-4cd8-9b9c-4b3d7eaec504" (UID: "36105d3f-3305-4cd8-9b9c-4b3d7eaec504"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.101403 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-kube-api-access-kzg2h" (OuterVolumeSpecName: "kube-api-access-kzg2h") pod "36105d3f-3305-4cd8-9b9c-4b3d7eaec504" (UID: "36105d3f-3305-4cd8-9b9c-4b3d7eaec504"). InnerVolumeSpecName "kube-api-access-kzg2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.122513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36105d3f-3305-4cd8-9b9c-4b3d7eaec504" (UID: "36105d3f-3305-4cd8-9b9c-4b3d7eaec504"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.123314 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-inventory" (OuterVolumeSpecName: "inventory") pod "36105d3f-3305-4cd8-9b9c-4b3d7eaec504" (UID: "36105d3f-3305-4cd8-9b9c-4b3d7eaec504"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.196918 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.197267 4749 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.197283 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzg2h\" (UniqueName: \"kubernetes.io/projected/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-kube-api-access-kzg2h\") on node \"crc\" DevicePath \"\"" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.197296 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36105d3f-3305-4cd8-9b9c-4b3d7eaec504-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.514428 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" event={"ID":"36105d3f-3305-4cd8-9b9c-4b3d7eaec504","Type":"ContainerDied","Data":"1bbae58c401fa51aa724a06c872381575cfa8d1c35a7ac9448852bf7df4f24a1"} Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.514483 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbae58c401fa51aa724a06c872381575cfa8d1c35a7ac9448852bf7df4f24a1" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.515352 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.635481 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn"] Oct 01 13:34:06 crc kubenswrapper[4749]: E1001 13:34:06.635988 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerName="extract-utilities" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.636010 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerName="extract-utilities" Oct 01 13:34:06 crc kubenswrapper[4749]: E1001 13:34:06.636028 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerName="registry-server" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.636037 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerName="registry-server" Oct 01 13:34:06 crc kubenswrapper[4749]: E1001 13:34:06.636054 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerName="extract-content" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.636063 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerName="extract-content" Oct 01 13:34:06 crc kubenswrapper[4749]: E1001 13:34:06.636087 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36105d3f-3305-4cd8-9b9c-4b3d7eaec504" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.636098 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="36105d3f-3305-4cd8-9b9c-4b3d7eaec504" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.636398 
4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="36105d3f-3305-4cd8-9b9c-4b3d7eaec504" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.636412 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a30ad4-57d2-42a2-b612-a0412064601d" containerName="registry-server" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.637199 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.641034 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.642263 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.642448 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.644620 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.648492 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn"] Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.708209 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 
13:34:06.708539 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.708651 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcvhz\" (UniqueName: \"kubernetes.io/projected/505c57d6-8e3e-469e-b7ed-c15bdff56519-kube-api-access-gcvhz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.795743 4749 scope.go:117] "RemoveContainer" containerID="3a80bdbcf49494f38dabca41e55665eb196d8b62b317fce61237d90d3c4ab196" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.810703 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.810785 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.810836 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcvhz\" (UniqueName: \"kubernetes.io/projected/505c57d6-8e3e-469e-b7ed-c15bdff56519-kube-api-access-gcvhz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.815032 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.815398 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.820546 4749 scope.go:117] "RemoveContainer" containerID="b13f73df7466a8e5676c158f45a06cb7dac5a47ecd4a682620d91d56ebe4f727" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.830570 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcvhz\" (UniqueName: \"kubernetes.io/projected/505c57d6-8e3e-469e-b7ed-c15bdff56519-kube-api-access-gcvhz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.883960 4749 scope.go:117] "RemoveContainer" 
containerID="dfe15272485bedc38157dcb6fc75040d770b1171f9480c598a69acc81d118e80" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.909536 4749 scope.go:117] "RemoveContainer" containerID="dff2cb79a21e89a17daf55ea1a475794c553ba6165c3ce83afce82e6a46be43e" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.967534 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" Oct 01 13:34:06 crc kubenswrapper[4749]: I1001 13:34:06.993361 4749 scope.go:117] "RemoveContainer" containerID="e698349bb482e78848b0954f08cf804bb2012efbd15bb9f4663316a7a52451b9" Oct 01 13:34:07 crc kubenswrapper[4749]: I1001 13:34:07.183000 4749 scope.go:117] "RemoveContainer" containerID="5568a633dbc33307256d1285560ced24873c6788d15fe1702c024bdaa886400e" Oct 01 13:34:07 crc kubenswrapper[4749]: I1001 13:34:07.579689 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn"] Oct 01 13:34:08 crc kubenswrapper[4749]: I1001 13:34:08.533877 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" event={"ID":"505c57d6-8e3e-469e-b7ed-c15bdff56519","Type":"ContainerStarted","Data":"5feadcc4d4570c461ea0e0563413f0caaf745507bf1b17332b97944638a4bead"} Oct 01 13:34:09 crc kubenswrapper[4749]: I1001 13:34:09.038879 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-46df-account-create-lm8pz"] Oct 01 13:34:09 crc kubenswrapper[4749]: I1001 13:34:09.065193 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-46df-account-create-lm8pz"] Oct 01 13:34:09 crc kubenswrapper[4749]: I1001 13:34:09.231548 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:34:09 crc kubenswrapper[4749]: E1001 13:34:09.232158 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:34:09 crc kubenswrapper[4749]: I1001 13:34:09.244667 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a108a313-0fd8-41bc-9fbf-a86a003fe066" path="/var/lib/kubelet/pods/a108a313-0fd8-41bc-9fbf-a86a003fe066/volumes" Oct 01 13:34:09 crc kubenswrapper[4749]: I1001 13:34:09.547691 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" event={"ID":"505c57d6-8e3e-469e-b7ed-c15bdff56519","Type":"ContainerStarted","Data":"16f896eefdd777f32e048d092977443e8af37d728f4196c13d02644d3bef2bc0"} Oct 01 13:34:09 crc kubenswrapper[4749]: I1001 13:34:09.572564 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" podStartSLOduration=2.831331866 podStartE2EDuration="3.572544419s" podCreationTimestamp="2025-10-01 13:34:06 +0000 UTC" firstStartedPulling="2025-10-01 13:34:07.581279633 +0000 UTC m=+1707.635264542" lastFinishedPulling="2025-10-01 13:34:08.322492156 +0000 UTC m=+1708.376477095" observedRunningTime="2025-10-01 13:34:09.566412722 +0000 UTC m=+1709.620397661" watchObservedRunningTime="2025-10-01 13:34:09.572544419 +0000 UTC m=+1709.626529338" Oct 01 13:34:17 crc kubenswrapper[4749]: I1001 13:34:17.032449 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b604-account-create-8v4pd"] Oct 01 13:34:17 crc kubenswrapper[4749]: I1001 13:34:17.048063 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-687d-account-create-9ssdm"] Oct 01 13:34:17 crc 
kubenswrapper[4749]: I1001 13:34:17.075533 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b604-account-create-8v4pd"] Oct 01 13:34:17 crc kubenswrapper[4749]: I1001 13:34:17.091797 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-687d-account-create-9ssdm"] Oct 01 13:34:17 crc kubenswrapper[4749]: I1001 13:34:17.241143 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e090bf-b7d2-4f65-b8c0-147f61612834" path="/var/lib/kubelet/pods/85e090bf-b7d2-4f65-b8c0-147f61612834/volumes" Oct 01 13:34:17 crc kubenswrapper[4749]: I1001 13:34:17.241828 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71" path="/var/lib/kubelet/pods/f9ea6d66-83a3-4ab2-8e8c-8dbf8b861f71/volumes" Oct 01 13:34:20 crc kubenswrapper[4749]: I1001 13:34:20.230937 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:34:20 crc kubenswrapper[4749]: E1001 13:34:20.231458 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:34:23 crc kubenswrapper[4749]: I1001 13:34:23.030759 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1a64-account-create-f5wl6"] Oct 01 13:34:23 crc kubenswrapper[4749]: I1001 13:34:23.042649 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1a64-account-create-f5wl6"] Oct 01 13:34:23 crc kubenswrapper[4749]: I1001 13:34:23.242301 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dc5f870d-1b2f-4cc2-93c7-6375aef31397" path="/var/lib/kubelet/pods/dc5f870d-1b2f-4cc2-93c7-6375aef31397/volumes" Oct 01 13:34:31 crc kubenswrapper[4749]: I1001 13:34:31.239597 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:34:31 crc kubenswrapper[4749]: E1001 13:34:31.240640 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:34:45 crc kubenswrapper[4749]: I1001 13:34:45.230263 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:34:45 crc kubenswrapper[4749]: E1001 13:34:45.231206 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:34:46 crc kubenswrapper[4749]: I1001 13:34:46.064812 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5lxgd"] Oct 01 13:34:46 crc kubenswrapper[4749]: I1001 13:34:46.095185 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5lxgd"] Oct 01 13:34:47 crc kubenswrapper[4749]: I1001 13:34:47.246862 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26e2be9-292a-4e1d-8ef4-98c9d9989cb0" 
path="/var/lib/kubelet/pods/f26e2be9-292a-4e1d-8ef4-98c9d9989cb0/volumes" Oct 01 13:34:50 crc kubenswrapper[4749]: I1001 13:34:50.034133 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-rws79"] Oct 01 13:34:50 crc kubenswrapper[4749]: I1001 13:34:50.046120 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6vcqk"] Oct 01 13:34:50 crc kubenswrapper[4749]: I1001 13:34:50.056479 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-rws79"] Oct 01 13:34:50 crc kubenswrapper[4749]: I1001 13:34:50.064531 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6vcqk"] Oct 01 13:34:51 crc kubenswrapper[4749]: I1001 13:34:51.255717 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a817a5-ab51-4aa9-a26d-81f451d600d1" path="/var/lib/kubelet/pods/13a817a5-ab51-4aa9-a26d-81f451d600d1/volumes" Oct 01 13:34:51 crc kubenswrapper[4749]: I1001 13:34:51.256708 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2694945-782c-44b4-9613-4b5adb24c52f" path="/var/lib/kubelet/pods/a2694945-782c-44b4-9613-4b5adb24c52f/volumes" Oct 01 13:34:55 crc kubenswrapper[4749]: I1001 13:34:55.028879 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-8whfl"] Oct 01 13:34:55 crc kubenswrapper[4749]: I1001 13:34:55.041383 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-8whfl"] Oct 01 13:34:55 crc kubenswrapper[4749]: I1001 13:34:55.241684 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9896d0f7-92a6-46b0-88ce-b64b390998c5" path="/var/lib/kubelet/pods/9896d0f7-92a6-46b0-88ce-b64b390998c5/volumes" Oct 01 13:34:56 crc kubenswrapper[4749]: I1001 13:34:56.043946 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ff7d-account-create-wcnjd"] Oct 01 13:34:56 crc kubenswrapper[4749]: I1001 
13:34:56.054740 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ff7d-account-create-wcnjd"] Oct 01 13:34:56 crc kubenswrapper[4749]: I1001 13:34:56.066992 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-mndqc"] Oct 01 13:34:56 crc kubenswrapper[4749]: I1001 13:34:56.074803 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-mndqc"] Oct 01 13:34:57 crc kubenswrapper[4749]: I1001 13:34:57.247181 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2" path="/var/lib/kubelet/pods/8fe6ed51-f8bc-4f1a-bae0-f2f6641b19a2/volumes" Oct 01 13:34:57 crc kubenswrapper[4749]: I1001 13:34:57.248633 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7bb7cd0-7579-44b5-bac2-ae93c122858a" path="/var/lib/kubelet/pods/f7bb7cd0-7579-44b5-bac2-ae93c122858a/volumes" Oct 01 13:34:59 crc kubenswrapper[4749]: I1001 13:34:59.229796 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:34:59 crc kubenswrapper[4749]: E1001 13:34:59.230315 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.370924 4749 scope.go:117] "RemoveContainer" containerID="dc75dd6f34881a4b422fb71e94f85f36204730c0de5864799d407d5ef016948a" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.403365 4749 scope.go:117] "RemoveContainer" containerID="15143a0978f9ddd502a12bb6b3e166c92a0b005fcfa606929bfc5f6f9f3753a5" Oct 01 13:35:07 crc 
kubenswrapper[4749]: I1001 13:35:07.448135 4749 scope.go:117] "RemoveContainer" containerID="b80ac3918b2c3cbc347792ef710b30513ad2f7f90512c91ee6b469b1520aee00" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.512347 4749 scope.go:117] "RemoveContainer" containerID="6720f9d321b381c7d988d12b9c1803ea121752f54b6405c028a939b068040d33" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.575165 4749 scope.go:117] "RemoveContainer" containerID="3d7ccdae9c6117608e6a377d7acb6b411bcb8eefe49a5cd79e68070783576383" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.619947 4749 scope.go:117] "RemoveContainer" containerID="29cf94d9da7b55be344b02f852233b488d6adacf32b0baf10734190b34ad340d" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.696873 4749 scope.go:117] "RemoveContainer" containerID="3f319d3ac0ef0dbe05163254438f315aac780f6ff7d92346c4031e079894a432" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.730508 4749 scope.go:117] "RemoveContainer" containerID="4bfdecd9965f22ae3d86a86b872d4d7a032dbdc8dc61e50b6185755a7540ca09" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.776036 4749 scope.go:117] "RemoveContainer" containerID="2f617727b338da067d00d361aabcf7196a049f8e26cbda97fea680dd621faca7" Oct 01 13:35:07 crc kubenswrapper[4749]: I1001 13:35:07.800117 4749 scope.go:117] "RemoveContainer" containerID="e934e7609fb2cefb489ea65c5f3c9dfab067f34d820cb1682e52b496f37f6621" Oct 01 13:35:10 crc kubenswrapper[4749]: I1001 13:35:10.230351 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:35:10 crc kubenswrapper[4749]: E1001 13:35:10.231350 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:35:23 crc kubenswrapper[4749]: I1001 13:35:23.230148 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:35:23 crc kubenswrapper[4749]: E1001 13:35:23.230940 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:35:24 crc kubenswrapper[4749]: I1001 13:35:24.043052 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0725-account-create-nxf2m"] Oct 01 13:35:24 crc kubenswrapper[4749]: I1001 13:35:24.055829 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4d06-account-create-rwqkp"] Oct 01 13:35:24 crc kubenswrapper[4749]: I1001 13:35:24.063673 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0725-account-create-nxf2m"] Oct 01 13:35:24 crc kubenswrapper[4749]: I1001 13:35:24.071812 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4d06-account-create-rwqkp"] Oct 01 13:35:25 crc kubenswrapper[4749]: I1001 13:35:25.251264 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48377824-164a-42bd-9374-f1fadf80ddf6" path="/var/lib/kubelet/pods/48377824-164a-42bd-9374-f1fadf80ddf6/volumes" Oct 01 13:35:25 crc kubenswrapper[4749]: I1001 13:35:25.252367 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75ae7232-a3d3-43dc-b003-f6e47b5e6868" path="/var/lib/kubelet/pods/75ae7232-a3d3-43dc-b003-f6e47b5e6868/volumes" Oct 01 13:35:28 crc 
kubenswrapper[4749]: I1001 13:35:28.031378 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-5j2w8"]
Oct 01 13:35:28 crc kubenswrapper[4749]: I1001 13:35:28.039363 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8zmhd"]
Oct 01 13:35:28 crc kubenswrapper[4749]: I1001 13:35:28.048750 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8zmhd"]
Oct 01 13:35:28 crc kubenswrapper[4749]: I1001 13:35:28.059159 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-5j2w8"]
Oct 01 13:35:29 crc kubenswrapper[4749]: I1001 13:35:29.240955 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea02603-ed44-4faa-ae1e-37cf61162fde" path="/var/lib/kubelet/pods/9ea02603-ed44-4faa-ae1e-37cf61162fde/volumes"
Oct 01 13:35:29 crc kubenswrapper[4749]: I1001 13:35:29.241556 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f32f512f-9aba-40b8-9f16-bcb1151eab3f" path="/var/lib/kubelet/pods/f32f512f-9aba-40b8-9f16-bcb1151eab3f/volumes"
Oct 01 13:35:37 crc kubenswrapper[4749]: I1001 13:35:37.058815 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-d55n6"]
Oct 01 13:35:37 crc kubenswrapper[4749]: I1001 13:35:37.074135 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-d55n6"]
Oct 01 13:35:37 crc kubenswrapper[4749]: I1001 13:35:37.247272 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bcae573-48fa-4920-8b8f-4df57d4c5375" path="/var/lib/kubelet/pods/7bcae573-48fa-4920-8b8f-4df57d4c5375/volumes"
Oct 01 13:35:38 crc kubenswrapper[4749]: I1001 13:35:38.230482 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56"
Oct 01 13:35:38 crc kubenswrapper[4749]: E1001 13:35:38.231077 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 13:35:49 crc kubenswrapper[4749]: I1001 13:35:49.046674 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-h459j"]
Oct 01 13:35:49 crc kubenswrapper[4749]: I1001 13:35:49.059455 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-h459j"]
Oct 01 13:35:49 crc kubenswrapper[4749]: I1001 13:35:49.251720 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f535ba4-1d6d-4103-8764-c324341bffdd" path="/var/lib/kubelet/pods/4f535ba4-1d6d-4103-8764-c324341bffdd/volumes"
Oct 01 13:35:50 crc kubenswrapper[4749]: I1001 13:35:50.231821 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56"
Oct 01 13:35:50 crc kubenswrapper[4749]: E1001 13:35:50.232433 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 13:35:53 crc kubenswrapper[4749]: I1001 13:35:53.039782 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6mlkv"]
Oct 01 13:35:53 crc kubenswrapper[4749]: I1001 13:35:53.051599 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6mlkv"]
Oct 01 13:35:53 crc kubenswrapper[4749]: I1001 13:35:53.243793 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b316dc-4b38-4e40-bc7f-e8d64a9caa78" path="/var/lib/kubelet/pods/e3b316dc-4b38-4e40-bc7f-e8d64a9caa78/volumes"
Oct 01 13:35:59 crc kubenswrapper[4749]: I1001 13:35:59.775426 4749 generic.go:334] "Generic (PLEG): container finished" podID="505c57d6-8e3e-469e-b7ed-c15bdff56519" containerID="16f896eefdd777f32e048d092977443e8af37d728f4196c13d02644d3bef2bc0" exitCode=0
Oct 01 13:35:59 crc kubenswrapper[4749]: I1001 13:35:59.775497 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" event={"ID":"505c57d6-8e3e-469e-b7ed-c15bdff56519","Type":"ContainerDied","Data":"16f896eefdd777f32e048d092977443e8af37d728f4196c13d02644d3bef2bc0"}
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.276061 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.287688 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-inventory\") pod \"505c57d6-8e3e-469e-b7ed-c15bdff56519\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") "
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.287857 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-ssh-key\") pod \"505c57d6-8e3e-469e-b7ed-c15bdff56519\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") "
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.288121 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcvhz\" (UniqueName: \"kubernetes.io/projected/505c57d6-8e3e-469e-b7ed-c15bdff56519-kube-api-access-gcvhz\") pod \"505c57d6-8e3e-469e-b7ed-c15bdff56519\" (UID: \"505c57d6-8e3e-469e-b7ed-c15bdff56519\") "
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.301904 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505c57d6-8e3e-469e-b7ed-c15bdff56519-kube-api-access-gcvhz" (OuterVolumeSpecName: "kube-api-access-gcvhz") pod "505c57d6-8e3e-469e-b7ed-c15bdff56519" (UID: "505c57d6-8e3e-469e-b7ed-c15bdff56519"). InnerVolumeSpecName "kube-api-access-gcvhz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.332628 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-inventory" (OuterVolumeSpecName: "inventory") pod "505c57d6-8e3e-469e-b7ed-c15bdff56519" (UID: "505c57d6-8e3e-469e-b7ed-c15bdff56519"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.336768 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "505c57d6-8e3e-469e-b7ed-c15bdff56519" (UID: "505c57d6-8e3e-469e-b7ed-c15bdff56519"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.391177 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.391236 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/505c57d6-8e3e-469e-b7ed-c15bdff56519-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.391250 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcvhz\" (UniqueName: \"kubernetes.io/projected/505c57d6-8e3e-469e-b7ed-c15bdff56519-kube-api-access-gcvhz\") on node \"crc\" DevicePath \"\""
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.805624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn" event={"ID":"505c57d6-8e3e-469e-b7ed-c15bdff56519","Type":"ContainerDied","Data":"5feadcc4d4570c461ea0e0563413f0caaf745507bf1b17332b97944638a4bead"}
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.805678 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5feadcc4d4570c461ea0e0563413f0caaf745507bf1b17332b97944638a4bead"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.805779 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.948537 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"]
Oct 01 13:36:01 crc kubenswrapper[4749]: E1001 13:36:01.949070 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505c57d6-8e3e-469e-b7ed-c15bdff56519" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.949093 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="505c57d6-8e3e-469e-b7ed-c15bdff56519" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.949353 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="505c57d6-8e3e-469e-b7ed-c15bdff56519" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.950194 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.953201 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.955034 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.955157 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.959626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"]
Oct 01 13:36:01 crc kubenswrapper[4749]: I1001 13:36:01.962372 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.004794 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.004863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5zb9\" (UniqueName: \"kubernetes.io/projected/f382a017-d5fe-45d9-ad7b-f9316dbd5834-kube-api-access-d5zb9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.004956 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.106505 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.106674 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5zb9\" (UniqueName: \"kubernetes.io/projected/f382a017-d5fe-45d9-ad7b-f9316dbd5834-kube-api-access-d5zb9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.106736 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.110632 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.111275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.124473 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5zb9\" (UniqueName: \"kubernetes.io/projected/f382a017-d5fe-45d9-ad7b-f9316dbd5834-kube-api-access-d5zb9\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.276964 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:36:02 crc kubenswrapper[4749]: I1001 13:36:02.866620 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"]
Oct 01 13:36:03 crc kubenswrapper[4749]: I1001 13:36:03.230277 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56"
Oct 01 13:36:03 crc kubenswrapper[4749]: I1001 13:36:03.828243 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx" event={"ID":"f382a017-d5fe-45d9-ad7b-f9316dbd5834","Type":"ContainerStarted","Data":"a77f466d0516e19eb2c792ea2d6e3fa5c5ffb3304449a3b9198d235b10d98865"}
Oct 01 13:36:03 crc kubenswrapper[4749]: I1001 13:36:03.828577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx" event={"ID":"f382a017-d5fe-45d9-ad7b-f9316dbd5834","Type":"ContainerStarted","Data":"f9bd459bfb7780c92a7238c49a397da7c44e1819403da082ce0bd2afc0c8c1ee"}
Oct 01 13:36:03 crc kubenswrapper[4749]: I1001 13:36:03.831606 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"b8204bdc15b502545a9526a4a541f27a52efd45ad6646c2c13cfdd5b53e3e274"}
Oct 01 13:36:03 crc kubenswrapper[4749]: I1001 13:36:03.855686 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx" podStartSLOduration=2.211509715 podStartE2EDuration="2.855651056s" podCreationTimestamp="2025-10-01 13:36:01 +0000 UTC" firstStartedPulling="2025-10-01 13:36:02.8711827 +0000 UTC m=+1822.925167619" lastFinishedPulling="2025-10-01 13:36:03.515324061 +0000 UTC m=+1823.569308960" observedRunningTime="2025-10-01 13:36:03.844402493 +0000 UTC m=+1823.898387412" watchObservedRunningTime="2025-10-01 13:36:03.855651056 +0000 UTC m=+1823.909636005"
Oct 01 13:36:08 crc kubenswrapper[4749]: I1001 13:36:08.084678 4749 scope.go:117] "RemoveContainer" containerID="3fb95f4cc56a0375e71e2b3c51326e8e57407bd5a387962adb5b31a1bd5478ad"
Oct 01 13:36:08 crc kubenswrapper[4749]: I1001 13:36:08.155584 4749 scope.go:117] "RemoveContainer" containerID="5aefa383a53b9a4e514e0c59c7f8455b1c72f300ad38b83b47060ca18c70a2a1"
Oct 01 13:36:08 crc kubenswrapper[4749]: I1001 13:36:08.190875 4749 scope.go:117] "RemoveContainer" containerID="f2c7a8ac184830fdaabe91a4deed389ee9f47f9f066b45e87c69b75bc57bce8e"
Oct 01 13:36:08 crc kubenswrapper[4749]: I1001 13:36:08.255298 4749 scope.go:117] "RemoveContainer" containerID="1e2f818f18cc349d92509baf7334070dc169aafeec21bb02c2074589bdda24e3"
Oct 01 13:36:08 crc kubenswrapper[4749]: I1001 13:36:08.318722 4749 scope.go:117] "RemoveContainer" containerID="13edf912c6e6920b991a543183e9f3e801bbc7c3b97b97a8a5c2bcf49253aeed"
Oct 01 13:36:08 crc kubenswrapper[4749]: I1001 13:36:08.357064 4749 scope.go:117] "RemoveContainer" containerID="7ad4fe9694512426bf33f9e9fffb584b0eb79d993f651695d89fa084a7ae5d2c"
Oct 01 13:36:08 crc kubenswrapper[4749]: I1001 13:36:08.439099 4749 scope.go:117] "RemoveContainer" containerID="810c58ee7255f52da3bbd535560e2957e9307c16dd19359a94c131804b3326aa"
Oct 01 13:36:30 crc kubenswrapper[4749]: I1001 13:36:30.054404 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ljzz5"]
Oct 01 13:36:30 crc kubenswrapper[4749]: I1001 13:36:30.066420 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hghc5"]
Oct 01 13:36:30 crc kubenswrapper[4749]: I1001 13:36:30.078999 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hghc5"]
Oct 01 13:36:30 crc kubenswrapper[4749]: I1001 13:36:30.088576 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-d8mdh"]
Oct 01 13:36:30 crc kubenswrapper[4749]: I1001 13:36:30.099286 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-d8mdh"]
Oct 01 13:36:30 crc kubenswrapper[4749]: I1001 13:36:30.109860 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ljzz5"]
Oct 01 13:36:31 crc kubenswrapper[4749]: I1001 13:36:31.240462 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cf66c1c-176c-4671-9887-07295eb47200" path="/var/lib/kubelet/pods/0cf66c1c-176c-4671-9887-07295eb47200/volumes"
Oct 01 13:36:31 crc kubenswrapper[4749]: I1001 13:36:31.241363 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6" path="/var/lib/kubelet/pods/3ab26a09-3ca5-4568-b097-b6c4cfa6c8b6/volumes"
Oct 01 13:36:31 crc kubenswrapper[4749]: I1001 13:36:31.242036 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1f0790-fd61-4075-b031-cd82fa151ab8" path="/var/lib/kubelet/pods/9b1f0790-fd61-4075-b031-cd82fa151ab8/volumes"
Oct 01 13:36:39 crc kubenswrapper[4749]: I1001 13:36:39.052944 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5927-account-create-rtd62"]
Oct 01 13:36:39 crc kubenswrapper[4749]: I1001 13:36:39.092781 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d9b5-account-create-6bqbd"]
Oct 01 13:36:39 crc kubenswrapper[4749]: I1001 13:36:39.104936 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5927-account-create-rtd62"]
Oct 01 13:36:39 crc kubenswrapper[4749]: I1001 13:36:39.116508 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d9b5-account-create-6bqbd"]
Oct 01 13:36:39 crc kubenswrapper[4749]: I1001 13:36:39.244426 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ae0f3d-5137-4015-a7c3-f082f485f058" path="/var/lib/kubelet/pods/66ae0f3d-5137-4015-a7c3-f082f485f058/volumes"
Oct 01 13:36:39 crc kubenswrapper[4749]: I1001 13:36:39.245420 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb1ec85-617c-4675-b366-68beb4f61f3a" path="/var/lib/kubelet/pods/bfb1ec85-617c-4675-b366-68beb4f61f3a/volumes"
Oct 01 13:36:40 crc kubenswrapper[4749]: I1001 13:36:40.031088 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tc29m"]
Oct 01 13:36:40 crc kubenswrapper[4749]: I1001 13:36:40.040547 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4752-account-create-zkt6l"]
Oct 01 13:36:40 crc kubenswrapper[4749]: I1001 13:36:40.049609 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4752-account-create-zkt6l"]
Oct 01 13:36:40 crc kubenswrapper[4749]: I1001 13:36:40.058905 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tc29m"]
Oct 01 13:36:41 crc kubenswrapper[4749]: I1001 13:36:41.241973 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688d40fd-4ccf-4228-9a52-e2bcdd3cf761" path="/var/lib/kubelet/pods/688d40fd-4ccf-4228-9a52-e2bcdd3cf761/volumes"
Oct 01 13:36:41 crc kubenswrapper[4749]: I1001 13:36:41.242852 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ead71f-c58e-4634-96f9-81c9b165e24c" path="/var/lib/kubelet/pods/d9ead71f-c58e-4634-96f9-81c9b165e24c/volumes"
Oct 01 13:37:08 crc kubenswrapper[4749]: I1001 13:37:08.598384 4749 scope.go:117] "RemoveContainer" containerID="8c773bad80c2b088d7018ffd8c47d352e36fe02527abfd1df6a15c019082d2ea"
Oct 01 13:37:08 crc kubenswrapper[4749]: I1001 13:37:08.630714 4749 scope.go:117] "RemoveContainer" containerID="23888105b7f6bee898bc2ccf61d14a10575290c1cc1ae4a8efda236744d121de"
Oct 01 13:37:08 crc kubenswrapper[4749]: I1001 13:37:08.685031 4749 scope.go:117] "RemoveContainer" containerID="475a3b76cce60952b57dc53079f6ec6a54634920be542a8e493dd4b52df36f0f"
Oct 01 13:37:08 crc kubenswrapper[4749]: I1001 13:37:08.728372 4749 scope.go:117] "RemoveContainer" containerID="ec82d5bfc32297ac60d77fde66e8e92e7ae2c1287ead785bfa8106988027fb00"
Oct 01 13:37:08 crc kubenswrapper[4749]: I1001 13:37:08.783699 4749 scope.go:117] "RemoveContainer" containerID="772eb734a26324a9b5d89a79a2165220c78dcb2ac21997af68fd098ccdc9f490"
Oct 01 13:37:08 crc kubenswrapper[4749]: I1001 13:37:08.816815 4749 scope.go:117] "RemoveContainer" containerID="3057dffa0bfb34dc4a25d3f4a9238802eb20368a4a8820bd259e900722093104"
Oct 01 13:37:08 crc kubenswrapper[4749]: I1001 13:37:08.861337 4749 scope.go:117] "RemoveContainer" containerID="b61e5a52990785704207c03519092fd1407992b2785b3eb0df19e683d390801f"
Oct 01 13:37:19 crc kubenswrapper[4749]: I1001 13:37:19.711084 4749 generic.go:334] "Generic (PLEG): container finished" podID="f382a017-d5fe-45d9-ad7b-f9316dbd5834" containerID="a77f466d0516e19eb2c792ea2d6e3fa5c5ffb3304449a3b9198d235b10d98865" exitCode=0
Oct 01 13:37:19 crc kubenswrapper[4749]: I1001 13:37:19.711257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx" event={"ID":"f382a017-d5fe-45d9-ad7b-f9316dbd5834","Type":"ContainerDied","Data":"a77f466d0516e19eb2c792ea2d6e3fa5c5ffb3304449a3b9198d235b10d98865"}
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.304134 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.414345 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-ssh-key\") pod \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") "
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.414414 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-inventory\") pod \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") "
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.414758 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5zb9\" (UniqueName: \"kubernetes.io/projected/f382a017-d5fe-45d9-ad7b-f9316dbd5834-kube-api-access-d5zb9\") pod \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\" (UID: \"f382a017-d5fe-45d9-ad7b-f9316dbd5834\") "
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.421454 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f382a017-d5fe-45d9-ad7b-f9316dbd5834-kube-api-access-d5zb9" (OuterVolumeSpecName: "kube-api-access-d5zb9") pod "f382a017-d5fe-45d9-ad7b-f9316dbd5834" (UID: "f382a017-d5fe-45d9-ad7b-f9316dbd5834"). InnerVolumeSpecName "kube-api-access-d5zb9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.443371 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f382a017-d5fe-45d9-ad7b-f9316dbd5834" (UID: "f382a017-d5fe-45d9-ad7b-f9316dbd5834"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.459871 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-inventory" (OuterVolumeSpecName: "inventory") pod "f382a017-d5fe-45d9-ad7b-f9316dbd5834" (UID: "f382a017-d5fe-45d9-ad7b-f9316dbd5834"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.520800 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5zb9\" (UniqueName: \"kubernetes.io/projected/f382a017-d5fe-45d9-ad7b-f9316dbd5834-kube-api-access-d5zb9\") on node \"crc\" DevicePath \"\""
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.520866 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.520882 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f382a017-d5fe-45d9-ad7b-f9316dbd5834-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.762690 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx" event={"ID":"f382a017-d5fe-45d9-ad7b-f9316dbd5834","Type":"ContainerDied","Data":"f9bd459bfb7780c92a7238c49a397da7c44e1819403da082ce0bd2afc0c8c1ee"}
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.762753 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9bd459bfb7780c92a7238c49a397da7c44e1819403da082ce0bd2afc0c8c1ee"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.762853 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.830935 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"]
Oct 01 13:37:21 crc kubenswrapper[4749]: E1001 13:37:21.831463 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f382a017-d5fe-45d9-ad7b-f9316dbd5834" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.831486 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f382a017-d5fe-45d9-ad7b-f9316dbd5834" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.831776 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f382a017-d5fe-45d9-ad7b-f9316dbd5834" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.832636 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.837417 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.837742 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.837950 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.838125 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.861000 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"]
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.933261 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv97h\" (UniqueName: \"kubernetes.io/projected/36d63e13-8131-47f4-a65a-a78db593d3bf-kube-api-access-qv97h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.933302 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:21 crc kubenswrapper[4749]: I1001 13:37:21.933387 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.036364 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv97h\" (UniqueName: \"kubernetes.io/projected/36d63e13-8131-47f4-a65a-a78db593d3bf-kube-api-access-qv97h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.036448 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.036646 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.044480 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.049199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.054038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv97h\" (UniqueName: \"kubernetes.io/projected/36d63e13-8131-47f4-a65a-a78db593d3bf-kube-api-access-qv97h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-w88x7\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.163923 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.741672 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"]
Oct 01 13:37:22 crc kubenswrapper[4749]: I1001 13:37:22.776965 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7" event={"ID":"36d63e13-8131-47f4-a65a-a78db593d3bf","Type":"ContainerStarted","Data":"10a7b9fc34ed2bbee831e49805573196c8bbf3ff313fc5b6f9321afddd90e69b"}
Oct 01 13:37:24 crc kubenswrapper[4749]: I1001 13:37:24.805970 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7" event={"ID":"36d63e13-8131-47f4-a65a-a78db593d3bf","Type":"ContainerStarted","Data":"a818f516bcc8894249f179e93ea79cd878a8b55ad2d64cae26813e45cd4a620d"}
Oct 01 13:37:24 crc kubenswrapper[4749]: I1001 13:37:24.825144 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7" podStartSLOduration=3.055665356 podStartE2EDuration="3.825124337s" podCreationTimestamp="2025-10-01 13:37:21 +0000 UTC" firstStartedPulling="2025-10-01 13:37:22.749867721 +0000 UTC m=+1902.803852620" lastFinishedPulling="2025-10-01 13:37:23.519326672 +0000 UTC m=+1903.573311601" observedRunningTime="2025-10-01 13:37:24.823669535 +0000 UTC m=+1904.877654494" watchObservedRunningTime="2025-10-01 13:37:24.825124337 +0000 UTC m=+1904.879109236"
Oct 01 13:37:29 crc kubenswrapper[4749]: I1001 13:37:29.869450 4749 generic.go:334] "Generic (PLEG): container finished" podID="36d63e13-8131-47f4-a65a-a78db593d3bf" containerID="a818f516bcc8894249f179e93ea79cd878a8b55ad2d64cae26813e45cd4a620d" exitCode=0
Oct 01 13:37:29 crc kubenswrapper[4749]: I1001 13:37:29.869553 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7" event={"ID":"36d63e13-8131-47f4-a65a-a78db593d3bf","Type":"ContainerDied","Data":"a818f516bcc8894249f179e93ea79cd878a8b55ad2d64cae26813e45cd4a620d"}
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.342899 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.443103 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-ssh-key\") pod \"36d63e13-8131-47f4-a65a-a78db593d3bf\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") "
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.443205 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv97h\" (UniqueName: \"kubernetes.io/projected/36d63e13-8131-47f4-a65a-a78db593d3bf-kube-api-access-qv97h\") pod \"36d63e13-8131-47f4-a65a-a78db593d3bf\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") "
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.443355 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-inventory\") pod \"36d63e13-8131-47f4-a65a-a78db593d3bf\" (UID: \"36d63e13-8131-47f4-a65a-a78db593d3bf\") "
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.449126 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d63e13-8131-47f4-a65a-a78db593d3bf-kube-api-access-qv97h" (OuterVolumeSpecName: "kube-api-access-qv97h") pod "36d63e13-8131-47f4-a65a-a78db593d3bf" (UID: "36d63e13-8131-47f4-a65a-a78db593d3bf"). InnerVolumeSpecName "kube-api-access-qv97h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.471421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36d63e13-8131-47f4-a65a-a78db593d3bf" (UID: "36d63e13-8131-47f4-a65a-a78db593d3bf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.492701 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-inventory" (OuterVolumeSpecName: "inventory") pod "36d63e13-8131-47f4-a65a-a78db593d3bf" (UID: "36d63e13-8131-47f4-a65a-a78db593d3bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.546488 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.546546 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv97h\" (UniqueName: \"kubernetes.io/projected/36d63e13-8131-47f4-a65a-a78db593d3bf-kube-api-access-qv97h\") on node \"crc\" DevicePath \"\""
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.546614 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36d63e13-8131-47f4-a65a-a78db593d3bf-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.899319 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7"
event={"ID":"36d63e13-8131-47f4-a65a-a78db593d3bf","Type":"ContainerDied","Data":"10a7b9fc34ed2bbee831e49805573196c8bbf3ff313fc5b6f9321afddd90e69b"} Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.899368 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a7b9fc34ed2bbee831e49805573196c8bbf3ff313fc5b6f9321afddd90e69b" Oct 01 13:37:31 crc kubenswrapper[4749]: I1001 13:37:31.899433 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-w88x7" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.014473 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf"] Oct 01 13:37:32 crc kubenswrapper[4749]: E1001 13:37:32.014964 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d63e13-8131-47f4-a65a-a78db593d3bf" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.014990 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d63e13-8131-47f4-a65a-a78db593d3bf" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.015285 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d63e13-8131-47f4-a65a-a78db593d3bf" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.016105 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.018699 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.018829 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.021640 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.026821 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.031931 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf"] Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.158759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrntr\" (UniqueName: \"kubernetes.io/projected/9ef2bd67-60d1-4f4b-893c-f7e22430addd-kube-api-access-nrntr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.159076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.159186 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.261052 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.261255 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.261446 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrntr\" (UniqueName: \"kubernetes.io/projected/9ef2bd67-60d1-4f4b-893c-f7e22430addd-kube-api-access-nrntr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.270932 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: 
\"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.280371 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.297466 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrntr\" (UniqueName: \"kubernetes.io/projected/9ef2bd67-60d1-4f4b-893c-f7e22430addd-kube-api-access-nrntr\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mmvsf\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.377414 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:37:32 crc kubenswrapper[4749]: I1001 13:37:32.987153 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf"] Oct 01 13:37:32 crc kubenswrapper[4749]: W1001 13:37:32.997121 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef2bd67_60d1_4f4b_893c_f7e22430addd.slice/crio-590dbb5a9c5c9584e267a2c24c54f13aa09a5e0d5f7ca992eed556346c0e90ec WatchSource:0}: Error finding container 590dbb5a9c5c9584e267a2c24c54f13aa09a5e0d5f7ca992eed556346c0e90ec: Status 404 returned error can't find the container with id 590dbb5a9c5c9584e267a2c24c54f13aa09a5e0d5f7ca992eed556346c0e90ec Oct 01 13:37:33 crc kubenswrapper[4749]: I1001 13:37:33.001172 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:37:33 crc kubenswrapper[4749]: I1001 13:37:33.925511 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" event={"ID":"9ef2bd67-60d1-4f4b-893c-f7e22430addd","Type":"ContainerStarted","Data":"590dbb5a9c5c9584e267a2c24c54f13aa09a5e0d5f7ca992eed556346c0e90ec"} Oct 01 13:37:34 crc kubenswrapper[4749]: I1001 13:37:34.937002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" event={"ID":"9ef2bd67-60d1-4f4b-893c-f7e22430addd","Type":"ContainerStarted","Data":"cbf8e4ab4022f0ba02d24bb0e0c22e0870c23a8e22156c981327fcd9fcf5a908"} Oct 01 13:37:34 crc kubenswrapper[4749]: I1001 13:37:34.962074 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" podStartSLOduration=2.9196631440000003 podStartE2EDuration="3.962051195s" podCreationTimestamp="2025-10-01 13:37:31 +0000 UTC" 
firstStartedPulling="2025-10-01 13:37:33.000930587 +0000 UTC m=+1913.054915516" lastFinishedPulling="2025-10-01 13:37:34.043318648 +0000 UTC m=+1914.097303567" observedRunningTime="2025-10-01 13:37:34.957531425 +0000 UTC m=+1915.011516324" watchObservedRunningTime="2025-10-01 13:37:34.962051195 +0000 UTC m=+1915.016036094" Oct 01 13:37:35 crc kubenswrapper[4749]: I1001 13:37:35.044617 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ts66r"] Oct 01 13:37:35 crc kubenswrapper[4749]: I1001 13:37:35.052475 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ts66r"] Oct 01 13:37:35 crc kubenswrapper[4749]: I1001 13:37:35.244419 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a63bfe18-de5a-44e8-abef-17ee0f2af92a" path="/var/lib/kubelet/pods/a63bfe18-de5a-44e8-abef-17ee0f2af92a/volumes" Oct 01 13:37:59 crc kubenswrapper[4749]: I1001 13:37:59.040013 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-tkdsq"] Oct 01 13:37:59 crc kubenswrapper[4749]: I1001 13:37:59.063077 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-tkdsq"] Oct 01 13:37:59 crc kubenswrapper[4749]: I1001 13:37:59.246210 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dcfec60-454f-4182-b53d-280d182dee40" path="/var/lib/kubelet/pods/0dcfec60-454f-4182-b53d-280d182dee40/volumes" Oct 01 13:38:03 crc kubenswrapper[4749]: I1001 13:38:03.042021 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5b99w"] Oct 01 13:38:03 crc kubenswrapper[4749]: I1001 13:38:03.060984 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5b99w"] Oct 01 13:38:03 crc kubenswrapper[4749]: I1001 13:38:03.244054 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="46286372-30ef-489e-8076-a65ad341d010" path="/var/lib/kubelet/pods/46286372-30ef-489e-8076-a65ad341d010/volumes" Oct 01 13:38:08 crc kubenswrapper[4749]: I1001 13:38:08.996127 4749 scope.go:117] "RemoveContainer" containerID="bc9e246cd99f270946e29c2281e7926f9b778fd8bcc539e8e67528d2e2d9d7e2" Oct 01 13:38:09 crc kubenswrapper[4749]: I1001 13:38:09.077569 4749 scope.go:117] "RemoveContainer" containerID="b0121f7354627d9fa574df9e28d73e13f44cdb1be6512aed3224e1dfb5cb8e07" Oct 01 13:38:09 crc kubenswrapper[4749]: I1001 13:38:09.124647 4749 scope.go:117] "RemoveContainer" containerID="369d9fd8645d6a719de992e7103cc6537fde49bf3dcf4f69544d82d1b2978f28" Oct 01 13:38:19 crc kubenswrapper[4749]: I1001 13:38:19.372894 4749 generic.go:334] "Generic (PLEG): container finished" podID="9ef2bd67-60d1-4f4b-893c-f7e22430addd" containerID="cbf8e4ab4022f0ba02d24bb0e0c22e0870c23a8e22156c981327fcd9fcf5a908" exitCode=0 Oct 01 13:38:19 crc kubenswrapper[4749]: I1001 13:38:19.373009 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" event={"ID":"9ef2bd67-60d1-4f4b-893c-f7e22430addd","Type":"ContainerDied","Data":"cbf8e4ab4022f0ba02d24bb0e0c22e0870c23a8e22156c981327fcd9fcf5a908"} Oct 01 13:38:20 crc kubenswrapper[4749]: I1001 13:38:20.972578 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.133015 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-inventory\") pod \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.133102 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrntr\" (UniqueName: \"kubernetes.io/projected/9ef2bd67-60d1-4f4b-893c-f7e22430addd-kube-api-access-nrntr\") pod \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.133280 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-ssh-key\") pod \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\" (UID: \"9ef2bd67-60d1-4f4b-893c-f7e22430addd\") " Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.142059 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef2bd67-60d1-4f4b-893c-f7e22430addd-kube-api-access-nrntr" (OuterVolumeSpecName: "kube-api-access-nrntr") pod "9ef2bd67-60d1-4f4b-893c-f7e22430addd" (UID: "9ef2bd67-60d1-4f4b-893c-f7e22430addd"). InnerVolumeSpecName "kube-api-access-nrntr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.172729 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-inventory" (OuterVolumeSpecName: "inventory") pod "9ef2bd67-60d1-4f4b-893c-f7e22430addd" (UID: "9ef2bd67-60d1-4f4b-893c-f7e22430addd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.191357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ef2bd67-60d1-4f4b-893c-f7e22430addd" (UID: "9ef2bd67-60d1-4f4b-893c-f7e22430addd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.236327 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.236639 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrntr\" (UniqueName: \"kubernetes.io/projected/9ef2bd67-60d1-4f4b-893c-f7e22430addd-kube-api-access-nrntr\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.236652 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ef2bd67-60d1-4f4b-893c-f7e22430addd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.401804 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" event={"ID":"9ef2bd67-60d1-4f4b-893c-f7e22430addd","Type":"ContainerDied","Data":"590dbb5a9c5c9584e267a2c24c54f13aa09a5e0d5f7ca992eed556346c0e90ec"} Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.401852 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="590dbb5a9c5c9584e267a2c24c54f13aa09a5e0d5f7ca992eed556346c0e90ec" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.401920 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mmvsf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.515697 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf"] Oct 01 13:38:21 crc kubenswrapper[4749]: E1001 13:38:21.516523 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef2bd67-60d1-4f4b-893c-f7e22430addd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.516558 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef2bd67-60d1-4f4b-893c-f7e22430addd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.516913 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef2bd67-60d1-4f4b-893c-f7e22430addd" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.518267 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.522628 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.523000 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.523239 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.524009 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.528196 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf"] Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.644994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtszb\" (UniqueName: \"kubernetes.io/projected/753c7ac8-1ca7-4787-af3b-87553f59bc9f-kube-api-access-jtszb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.645085 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.645156 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.747040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.747400 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtszb\" (UniqueName: \"kubernetes.io/projected/753c7ac8-1ca7-4787-af3b-87553f59bc9f-kube-api-access-jtszb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.747488 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.752933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: 
\"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.752949 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.772424 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtszb\" (UniqueName: \"kubernetes.io/projected/753c7ac8-1ca7-4787-af3b-87553f59bc9f-kube-api-access-jtszb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-q4scf\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:21 crc kubenswrapper[4749]: I1001 13:38:21.840756 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:38:23 crc kubenswrapper[4749]: I1001 13:38:23.213946 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf"] Oct 01 13:38:23 crc kubenswrapper[4749]: I1001 13:38:23.421692 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" event={"ID":"753c7ac8-1ca7-4787-af3b-87553f59bc9f","Type":"ContainerStarted","Data":"d9c7f1b55d042391b0459c284acd43c5e9845774c043497018cc37c8b0268acb"} Oct 01 13:38:24 crc kubenswrapper[4749]: I1001 13:38:24.432409 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" event={"ID":"753c7ac8-1ca7-4787-af3b-87553f59bc9f","Type":"ContainerStarted","Data":"a08eb7256e20c8d2f9db6480e935ae8c265d7fcf10da902cd7e6bf1e49046ebf"} Oct 01 13:38:25 crc kubenswrapper[4749]: I1001 13:38:25.475454 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" podStartSLOduration=3.58865014 podStartE2EDuration="4.475434411s" podCreationTimestamp="2025-10-01 13:38:21 +0000 UTC" firstStartedPulling="2025-10-01 13:38:23.239282926 +0000 UTC m=+1963.293267825" lastFinishedPulling="2025-10-01 13:38:24.126067167 +0000 UTC m=+1964.180052096" observedRunningTime="2025-10-01 13:38:25.467894276 +0000 UTC m=+1965.521879215" watchObservedRunningTime="2025-10-01 13:38:25.475434411 +0000 UTC m=+1965.529419310" Oct 01 13:38:32 crc kubenswrapper[4749]: I1001 13:38:32.106640 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:38:32 crc 
kubenswrapper[4749]: I1001 13:38:32.107209 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:38:43 crc kubenswrapper[4749]: I1001 13:38:43.039161 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-z7885"] Oct 01 13:38:43 crc kubenswrapper[4749]: I1001 13:38:43.050526 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-z7885"] Oct 01 13:38:43 crc kubenswrapper[4749]: I1001 13:38:43.243530 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b410813-5ad4-464d-a144-34501ae862c1" path="/var/lib/kubelet/pods/4b410813-5ad4-464d-a144-34501ae862c1/volumes" Oct 01 13:39:02 crc kubenswrapper[4749]: I1001 13:39:02.108593 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:39:02 crc kubenswrapper[4749]: I1001 13:39:02.110113 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:39:09 crc kubenswrapper[4749]: I1001 13:39:09.279850 4749 scope.go:117] "RemoveContainer" containerID="23d0d19b2b004bd8022fb1f8982fcb13b59edbbbe30ba9c98ad02f4458766cd1" Oct 01 13:39:25 crc kubenswrapper[4749]: I1001 13:39:25.136785 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="753c7ac8-1ca7-4787-af3b-87553f59bc9f" containerID="a08eb7256e20c8d2f9db6480e935ae8c265d7fcf10da902cd7e6bf1e49046ebf" exitCode=2 Oct 01 13:39:25 crc kubenswrapper[4749]: I1001 13:39:25.137636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" event={"ID":"753c7ac8-1ca7-4787-af3b-87553f59bc9f","Type":"ContainerDied","Data":"a08eb7256e20c8d2f9db6480e935ae8c265d7fcf10da902cd7e6bf1e49046ebf"} Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.649852 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.736032 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtszb\" (UniqueName: \"kubernetes.io/projected/753c7ac8-1ca7-4787-af3b-87553f59bc9f-kube-api-access-jtszb\") pod \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.736107 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-inventory\") pod \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.736181 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-ssh-key\") pod \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\" (UID: \"753c7ac8-1ca7-4787-af3b-87553f59bc9f\") " Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.741700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753c7ac8-1ca7-4787-af3b-87553f59bc9f-kube-api-access-jtszb" (OuterVolumeSpecName: 
"kube-api-access-jtszb") pod "753c7ac8-1ca7-4787-af3b-87553f59bc9f" (UID: "753c7ac8-1ca7-4787-af3b-87553f59bc9f"). InnerVolumeSpecName "kube-api-access-jtszb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.769775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "753c7ac8-1ca7-4787-af3b-87553f59bc9f" (UID: "753c7ac8-1ca7-4787-af3b-87553f59bc9f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.776708 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-inventory" (OuterVolumeSpecName: "inventory") pod "753c7ac8-1ca7-4787-af3b-87553f59bc9f" (UID: "753c7ac8-1ca7-4787-af3b-87553f59bc9f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.839152 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtszb\" (UniqueName: \"kubernetes.io/projected/753c7ac8-1ca7-4787-af3b-87553f59bc9f-kube-api-access-jtszb\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.839188 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:26 crc kubenswrapper[4749]: I1001 13:39:26.839200 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/753c7ac8-1ca7-4787-af3b-87553f59bc9f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:39:27 crc kubenswrapper[4749]: I1001 13:39:27.159736 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" event={"ID":"753c7ac8-1ca7-4787-af3b-87553f59bc9f","Type":"ContainerDied","Data":"d9c7f1b55d042391b0459c284acd43c5e9845774c043497018cc37c8b0268acb"} Oct 01 13:39:27 crc kubenswrapper[4749]: I1001 13:39:27.159785 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9c7f1b55d042391b0459c284acd43c5e9845774c043497018cc37c8b0268acb" Oct 01 13:39:27 crc kubenswrapper[4749]: I1001 13:39:27.159800 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-q4scf" Oct 01 13:39:32 crc kubenswrapper[4749]: I1001 13:39:32.107077 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:39:32 crc kubenswrapper[4749]: I1001 13:39:32.107904 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:39:32 crc kubenswrapper[4749]: I1001 13:39:32.107978 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:39:32 crc kubenswrapper[4749]: I1001 13:39:32.109187 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8204bdc15b502545a9526a4a541f27a52efd45ad6646c2c13cfdd5b53e3e274"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon 
failed liveness probe, will be restarted" Oct 01 13:39:32 crc kubenswrapper[4749]: I1001 13:39:32.109383 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://b8204bdc15b502545a9526a4a541f27a52efd45ad6646c2c13cfdd5b53e3e274" gracePeriod=600 Oct 01 13:39:33 crc kubenswrapper[4749]: I1001 13:39:33.246620 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="b8204bdc15b502545a9526a4a541f27a52efd45ad6646c2c13cfdd5b53e3e274" exitCode=0 Oct 01 13:39:33 crc kubenswrapper[4749]: I1001 13:39:33.259901 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"b8204bdc15b502545a9526a4a541f27a52efd45ad6646c2c13cfdd5b53e3e274"} Oct 01 13:39:33 crc kubenswrapper[4749]: I1001 13:39:33.263548 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f"} Oct 01 13:39:33 crc kubenswrapper[4749]: I1001 13:39:33.263619 4749 scope.go:117] "RemoveContainer" containerID="13934ee4755b956e483b5357fb76386971613c02faac02ebc0213d6777eb2c56" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.038200 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t"] Oct 01 13:39:34 crc kubenswrapper[4749]: E1001 13:39:34.039111 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="753c7ac8-1ca7-4787-af3b-87553f59bc9f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:39:34 crc 
kubenswrapper[4749]: I1001 13:39:34.039190 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="753c7ac8-1ca7-4787-af3b-87553f59bc9f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.039459 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="753c7ac8-1ca7-4787-af3b-87553f59bc9f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.040456 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.043123 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.043209 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.043475 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.044643 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.048839 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t"] Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.101073 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqmwr\" (UniqueName: \"kubernetes.io/projected/45b8016b-ecf1-4187-98eb-daf846021c8c-kube-api-access-rqmwr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.101419 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.101708 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.203251 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.203384 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqmwr\" (UniqueName: \"kubernetes.io/projected/45b8016b-ecf1-4187-98eb-daf846021c8c-kube-api-access-rqmwr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.203462 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.210041 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.222370 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.223124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqmwr\" (UniqueName: \"kubernetes.io/projected/45b8016b-ecf1-4187-98eb-daf846021c8c-kube-api-access-rqmwr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.361576 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" Oct 01 13:39:34 crc kubenswrapper[4749]: I1001 13:39:34.909748 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t"] Oct 01 13:39:34 crc kubenswrapper[4749]: W1001 13:39:34.916022 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45b8016b_ecf1_4187_98eb_daf846021c8c.slice/crio-9af8935e2a6470826951df519bc5828b253dff74b2d28f9e247e70b4f583a0cc WatchSource:0}: Error finding container 9af8935e2a6470826951df519bc5828b253dff74b2d28f9e247e70b4f583a0cc: Status 404 returned error can't find the container with id 9af8935e2a6470826951df519bc5828b253dff74b2d28f9e247e70b4f583a0cc Oct 01 13:39:35 crc kubenswrapper[4749]: I1001 13:39:35.267715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" event={"ID":"45b8016b-ecf1-4187-98eb-daf846021c8c","Type":"ContainerStarted","Data":"9af8935e2a6470826951df519bc5828b253dff74b2d28f9e247e70b4f583a0cc"} Oct 01 13:39:36 crc kubenswrapper[4749]: I1001 13:39:36.285893 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" event={"ID":"45b8016b-ecf1-4187-98eb-daf846021c8c","Type":"ContainerStarted","Data":"be2314a38cb69fd2b07eb81d1e09e4d45d06c8d6f1e2bcc8a3b4d0e95b8a6c75"} Oct 01 13:39:36 crc kubenswrapper[4749]: I1001 13:39:36.317621 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" podStartSLOduration=1.6730507650000002 podStartE2EDuration="2.317595895s" podCreationTimestamp="2025-10-01 13:39:34 +0000 UTC" firstStartedPulling="2025-10-01 13:39:34.918344832 +0000 UTC m=+2034.972329731" lastFinishedPulling="2025-10-01 13:39:35.562889972 +0000 UTC 
m=+2035.616874861" observedRunningTime="2025-10-01 13:39:36.302635137 +0000 UTC m=+2036.356620066" watchObservedRunningTime="2025-10-01 13:39:36.317595895 +0000 UTC m=+2036.371580824" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.006148 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zfqxn"] Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.009668 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.028402 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfqxn"] Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.152863 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8ql6\" (UniqueName: \"kubernetes.io/projected/4268cdcd-8c98-4dfd-911f-0297aa37f02a-kube-api-access-n8ql6\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.152935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-utilities\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.153068 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-catalog-content\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 
13:39:49.255011 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8ql6\" (UniqueName: \"kubernetes.io/projected/4268cdcd-8c98-4dfd-911f-0297aa37f02a-kube-api-access-n8ql6\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.255061 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-utilities\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.255088 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-catalog-content\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.255581 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-utilities\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.255688 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-catalog-content\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.273184 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n8ql6\" (UniqueName: \"kubernetes.io/projected/4268cdcd-8c98-4dfd-911f-0297aa37f02a-kube-api-access-n8ql6\") pod \"redhat-operators-zfqxn\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.352881 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:39:49 crc kubenswrapper[4749]: I1001 13:39:49.827985 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zfqxn"] Oct 01 13:39:50 crc kubenswrapper[4749]: I1001 13:39:50.435725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfqxn" event={"ID":"4268cdcd-8c98-4dfd-911f-0297aa37f02a","Type":"ContainerStarted","Data":"a13678b50ef2993aa5063de14c095ad266e645145a8cb7c9695627b5102e7c83"} Oct 01 13:39:51 crc kubenswrapper[4749]: I1001 13:39:51.446330 4749 generic.go:334] "Generic (PLEG): container finished" podID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerID="a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187" exitCode=0 Oct 01 13:39:51 crc kubenswrapper[4749]: I1001 13:39:51.447388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfqxn" event={"ID":"4268cdcd-8c98-4dfd-911f-0297aa37f02a","Type":"ContainerDied","Data":"a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187"} Oct 01 13:39:57 crc kubenswrapper[4749]: I1001 13:39:57.505372 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfqxn" event={"ID":"4268cdcd-8c98-4dfd-911f-0297aa37f02a","Type":"ContainerStarted","Data":"99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5"} Oct 01 13:39:58 crc kubenswrapper[4749]: I1001 13:39:58.516539 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerID="99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5" exitCode=0 Oct 01 13:39:58 crc kubenswrapper[4749]: I1001 13:39:58.516605 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfqxn" event={"ID":"4268cdcd-8c98-4dfd-911f-0297aa37f02a","Type":"ContainerDied","Data":"99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5"} Oct 01 13:40:05 crc kubenswrapper[4749]: I1001 13:40:05.637820 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfqxn" event={"ID":"4268cdcd-8c98-4dfd-911f-0297aa37f02a","Type":"ContainerStarted","Data":"b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843"} Oct 01 13:40:09 crc kubenswrapper[4749]: I1001 13:40:09.353727 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:40:09 crc kubenswrapper[4749]: I1001 13:40:09.354072 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:40:10 crc kubenswrapper[4749]: I1001 13:40:10.411949 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zfqxn" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="registry-server" probeResult="failure" output=< Oct 01 13:40:10 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Oct 01 13:40:10 crc kubenswrapper[4749]: > Oct 01 13:40:19 crc kubenswrapper[4749]: I1001 13:40:19.397599 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:40:19 crc kubenswrapper[4749]: I1001 13:40:19.428593 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zfqxn" podStartSLOduration=17.557707337 
podStartE2EDuration="31.428574942s" podCreationTimestamp="2025-10-01 13:39:48 +0000 UTC" firstStartedPulling="2025-10-01 13:39:51.448706585 +0000 UTC m=+2051.502691494" lastFinishedPulling="2025-10-01 13:40:05.31957416 +0000 UTC m=+2065.373559099" observedRunningTime="2025-10-01 13:40:05.665534008 +0000 UTC m=+2065.719518907" watchObservedRunningTime="2025-10-01 13:40:19.428574942 +0000 UTC m=+2079.482559841" Oct 01 13:40:19 crc kubenswrapper[4749]: I1001 13:40:19.451431 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:40:20 crc kubenswrapper[4749]: I1001 13:40:20.208305 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfqxn"] Oct 01 13:40:20 crc kubenswrapper[4749]: I1001 13:40:20.822649 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zfqxn" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="registry-server" containerID="cri-o://b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843" gracePeriod=2 Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.311244 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.475008 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8ql6\" (UniqueName: \"kubernetes.io/projected/4268cdcd-8c98-4dfd-911f-0297aa37f02a-kube-api-access-n8ql6\") pod \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.475101 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-catalog-content\") pod \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.475178 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-utilities\") pod \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\" (UID: \"4268cdcd-8c98-4dfd-911f-0297aa37f02a\") " Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.476579 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-utilities" (OuterVolumeSpecName: "utilities") pod "4268cdcd-8c98-4dfd-911f-0297aa37f02a" (UID: "4268cdcd-8c98-4dfd-911f-0297aa37f02a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.480482 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4268cdcd-8c98-4dfd-911f-0297aa37f02a-kube-api-access-n8ql6" (OuterVolumeSpecName: "kube-api-access-n8ql6") pod "4268cdcd-8c98-4dfd-911f-0297aa37f02a" (UID: "4268cdcd-8c98-4dfd-911f-0297aa37f02a"). InnerVolumeSpecName "kube-api-access-n8ql6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.578009 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8ql6\" (UniqueName: \"kubernetes.io/projected/4268cdcd-8c98-4dfd-911f-0297aa37f02a-kube-api-access-n8ql6\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.578279 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.614792 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4268cdcd-8c98-4dfd-911f-0297aa37f02a" (UID: "4268cdcd-8c98-4dfd-911f-0297aa37f02a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.680389 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4268cdcd-8c98-4dfd-911f-0297aa37f02a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.833869 4749 generic.go:334] "Generic (PLEG): container finished" podID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerID="b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843" exitCode=0 Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.833946 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zfqxn" event={"ID":"4268cdcd-8c98-4dfd-911f-0297aa37f02a","Type":"ContainerDied","Data":"b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843"} Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.834039 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zfqxn" event={"ID":"4268cdcd-8c98-4dfd-911f-0297aa37f02a","Type":"ContainerDied","Data":"a13678b50ef2993aa5063de14c095ad266e645145a8cb7c9695627b5102e7c83"} Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.834083 4749 scope.go:117] "RemoveContainer" containerID="b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.835112 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zfqxn" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.877826 4749 scope.go:117] "RemoveContainer" containerID="99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.878094 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zfqxn"] Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.886244 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zfqxn"] Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.904814 4749 scope.go:117] "RemoveContainer" containerID="a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.950036 4749 scope.go:117] "RemoveContainer" containerID="b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843" Oct 01 13:40:21 crc kubenswrapper[4749]: E1001 13:40:21.950620 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843\": container with ID starting with b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843 not found: ID does not exist" containerID="b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.950669 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843"} err="failed to get container status \"b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843\": rpc error: code = NotFound desc = could not find container \"b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843\": container with ID starting with b7a66d929e42aa4355d4fee30346925593c41c34429a7212a79dbdb360c93843 not found: ID does not exist" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.950702 4749 scope.go:117] "RemoveContainer" containerID="99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5" Oct 01 13:40:21 crc kubenswrapper[4749]: E1001 13:40:21.951161 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5\": container with ID starting with 99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5 not found: ID does not exist" containerID="99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.951194 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5"} err="failed to get container status \"99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5\": rpc error: code = NotFound desc = could not find container \"99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5\": container with ID starting with 99bf880578b0de7067b363a1ee1ea756876406a443a5c4168c0de7e381c936d5 not found: ID does not exist" Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.951211 4749 scope.go:117] "RemoveContainer" containerID="a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187" Oct 01 13:40:21 crc kubenswrapper[4749]: E1001 
13:40:21.951668 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187\": container with ID starting with a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187 not found: ID does not exist" containerID="a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187"
Oct 01 13:40:21 crc kubenswrapper[4749]: I1001 13:40:21.951729 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187"} err="failed to get container status \"a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187\": rpc error: code = NotFound desc = could not find container \"a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187\": container with ID starting with a7b46aed97a5c0a5225978cb650bc456129e23aefb910ff33ed9861a021fa187 not found: ID does not exist"
Oct 01 13:40:23 crc kubenswrapper[4749]: I1001 13:40:23.253233 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" path="/var/lib/kubelet/pods/4268cdcd-8c98-4dfd-911f-0297aa37f02a/volumes"
Oct 01 13:40:32 crc kubenswrapper[4749]: I1001 13:40:32.955794 4749 generic.go:334] "Generic (PLEG): container finished" podID="45b8016b-ecf1-4187-98eb-daf846021c8c" containerID="be2314a38cb69fd2b07eb81d1e09e4d45d06c8d6f1e2bcc8a3b4d0e95b8a6c75" exitCode=0
Oct 01 13:40:32 crc kubenswrapper[4749]: I1001 13:40:32.956241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" event={"ID":"45b8016b-ecf1-4187-98eb-daf846021c8c","Type":"ContainerDied","Data":"be2314a38cb69fd2b07eb81d1e09e4d45d06c8d6f1e2bcc8a3b4d0e95b8a6c75"}
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.431451 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t"
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.561507 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-inventory\") pod \"45b8016b-ecf1-4187-98eb-daf846021c8c\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") "
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.562073 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqmwr\" (UniqueName: \"kubernetes.io/projected/45b8016b-ecf1-4187-98eb-daf846021c8c-kube-api-access-rqmwr\") pod \"45b8016b-ecf1-4187-98eb-daf846021c8c\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") "
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.562178 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-ssh-key\") pod \"45b8016b-ecf1-4187-98eb-daf846021c8c\" (UID: \"45b8016b-ecf1-4187-98eb-daf846021c8c\") "
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.567909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b8016b-ecf1-4187-98eb-daf846021c8c-kube-api-access-rqmwr" (OuterVolumeSpecName: "kube-api-access-rqmwr") pod "45b8016b-ecf1-4187-98eb-daf846021c8c" (UID: "45b8016b-ecf1-4187-98eb-daf846021c8c"). InnerVolumeSpecName "kube-api-access-rqmwr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.587019 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-inventory" (OuterVolumeSpecName: "inventory") pod "45b8016b-ecf1-4187-98eb-daf846021c8c" (UID: "45b8016b-ecf1-4187-98eb-daf846021c8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.603967 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "45b8016b-ecf1-4187-98eb-daf846021c8c" (UID: "45b8016b-ecf1-4187-98eb-daf846021c8c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.664442 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.664472 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45b8016b-ecf1-4187-98eb-daf846021c8c-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.664482 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqmwr\" (UniqueName: \"kubernetes.io/projected/45b8016b-ecf1-4187-98eb-daf846021c8c-kube-api-access-rqmwr\") on node \"crc\" DevicePath \"\""
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.982069 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t" event={"ID":"45b8016b-ecf1-4187-98eb-daf846021c8c","Type":"ContainerDied","Data":"9af8935e2a6470826951df519bc5828b253dff74b2d28f9e247e70b4f583a0cc"}
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.982117 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9af8935e2a6470826951df519bc5828b253dff74b2d28f9e247e70b4f583a0cc"
Oct 01 13:40:34 crc kubenswrapper[4749]: I1001 13:40:34.982132 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.090775 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r4gm5"]
Oct 01 13:40:35 crc kubenswrapper[4749]: E1001 13:40:35.091322 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="registry-server"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.091343 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="registry-server"
Oct 01 13:40:35 crc kubenswrapper[4749]: E1001 13:40:35.091388 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="extract-content"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.091397 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="extract-content"
Oct 01 13:40:35 crc kubenswrapper[4749]: E1001 13:40:35.091422 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b8016b-ecf1-4187-98eb-daf846021c8c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.091432 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b8016b-ecf1-4187-98eb-daf846021c8c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:40:35 crc kubenswrapper[4749]: E1001 13:40:35.091446 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="extract-utilities"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.091455 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="extract-utilities"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.091775 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4268cdcd-8c98-4dfd-911f-0297aa37f02a" containerName="registry-server"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.091801 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b8016b-ecf1-4187-98eb-daf846021c8c" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.092663 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.095018 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.095061 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.095610 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.096212 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.108419 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r4gm5"]
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.176939 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbhqm\" (UniqueName: \"kubernetes.io/projected/99ed982c-1039-47b4-b8f8-fcc9d06e636d-kube-api-access-cbhqm\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.177028 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.177055 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.278796 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.278867 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.279015 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbhqm\" (UniqueName: \"kubernetes.io/projected/99ed982c-1039-47b4-b8f8-fcc9d06e636d-kube-api-access-cbhqm\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.282927 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.283245 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.301141 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbhqm\" (UniqueName: \"kubernetes.io/projected/99ed982c-1039-47b4-b8f8-fcc9d06e636d-kube-api-access-cbhqm\") pod \"ssh-known-hosts-edpm-deployment-r4gm5\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") " pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.421140 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:35 crc kubenswrapper[4749]: I1001 13:40:35.991726 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r4gm5"]
Oct 01 13:40:35 crc kubenswrapper[4749]: W1001 13:40:35.993031 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99ed982c_1039_47b4_b8f8_fcc9d06e636d.slice/crio-87c86518b06937ef8cd07d759984f5813f74cd5361f555d9eb2c4814602d22d8 WatchSource:0}: Error finding container 87c86518b06937ef8cd07d759984f5813f74cd5361f555d9eb2c4814602d22d8: Status 404 returned error can't find the container with id 87c86518b06937ef8cd07d759984f5813f74cd5361f555d9eb2c4814602d22d8
Oct 01 13:40:37 crc kubenswrapper[4749]: I1001 13:40:37.000738 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5" event={"ID":"99ed982c-1039-47b4-b8f8-fcc9d06e636d","Type":"ContainerStarted","Data":"0933aaca048ae68c1ba4da5b2fbfab25e5f5416338279dc1750769be8122277f"}
Oct 01 13:40:37 crc kubenswrapper[4749]: I1001 13:40:37.001829 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5" event={"ID":"99ed982c-1039-47b4-b8f8-fcc9d06e636d","Type":"ContainerStarted","Data":"87c86518b06937ef8cd07d759984f5813f74cd5361f555d9eb2c4814602d22d8"}
Oct 01 13:40:37 crc kubenswrapper[4749]: I1001 13:40:37.026068 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5" podStartSLOduration=1.441898546 podStartE2EDuration="2.02604792s" podCreationTimestamp="2025-10-01 13:40:35 +0000 UTC" firstStartedPulling="2025-10-01 13:40:35.99636671 +0000 UTC m=+2096.050351609" lastFinishedPulling="2025-10-01 13:40:36.580516074 +0000 UTC m=+2096.634500983" observedRunningTime="2025-10-01 13:40:37.016909907 +0000 UTC m=+2097.070894806" watchObservedRunningTime="2025-10-01 13:40:37.02604792 +0000 UTC m=+2097.080032819"
Oct 01 13:40:45 crc kubenswrapper[4749]: I1001 13:40:45.085983 4749 generic.go:334] "Generic (PLEG): container finished" podID="99ed982c-1039-47b4-b8f8-fcc9d06e636d" containerID="0933aaca048ae68c1ba4da5b2fbfab25e5f5416338279dc1750769be8122277f" exitCode=0
Oct 01 13:40:45 crc kubenswrapper[4749]: I1001 13:40:45.086129 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5" event={"ID":"99ed982c-1039-47b4-b8f8-fcc9d06e636d","Type":"ContainerDied","Data":"0933aaca048ae68c1ba4da5b2fbfab25e5f5416338279dc1750769be8122277f"}
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.590031 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.729309 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-inventory-0\") pod \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") "
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.729511 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-ssh-key-openstack-edpm-ipam\") pod \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") "
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.729607 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbhqm\" (UniqueName: \"kubernetes.io/projected/99ed982c-1039-47b4-b8f8-fcc9d06e636d-kube-api-access-cbhqm\") pod \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\" (UID: \"99ed982c-1039-47b4-b8f8-fcc9d06e636d\") "
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.737536 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ed982c-1039-47b4-b8f8-fcc9d06e636d-kube-api-access-cbhqm" (OuterVolumeSpecName: "kube-api-access-cbhqm") pod "99ed982c-1039-47b4-b8f8-fcc9d06e636d" (UID: "99ed982c-1039-47b4-b8f8-fcc9d06e636d"). InnerVolumeSpecName "kube-api-access-cbhqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.767505 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "99ed982c-1039-47b4-b8f8-fcc9d06e636d" (UID: "99ed982c-1039-47b4-b8f8-fcc9d06e636d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.768775 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "99ed982c-1039-47b4-b8f8-fcc9d06e636d" (UID: "99ed982c-1039-47b4-b8f8-fcc9d06e636d"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.836430 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.836475 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbhqm\" (UniqueName: \"kubernetes.io/projected/99ed982c-1039-47b4-b8f8-fcc9d06e636d-kube-api-access-cbhqm\") on node \"crc\" DevicePath \"\""
Oct 01 13:40:46 crc kubenswrapper[4749]: I1001 13:40:46.836499 4749 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/99ed982c-1039-47b4-b8f8-fcc9d06e636d-inventory-0\") on node \"crc\" DevicePath \"\""
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.110900 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5" event={"ID":"99ed982c-1039-47b4-b8f8-fcc9d06e636d","Type":"ContainerDied","Data":"87c86518b06937ef8cd07d759984f5813f74cd5361f555d9eb2c4814602d22d8"}
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.110957 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87c86518b06937ef8cd07d759984f5813f74cd5361f555d9eb2c4814602d22d8"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.110997 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r4gm5"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.200995 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"]
Oct 01 13:40:47 crc kubenswrapper[4749]: E1001 13:40:47.201639 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed982c-1039-47b4-b8f8-fcc9d06e636d" containerName="ssh-known-hosts-edpm-deployment"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.201671 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed982c-1039-47b4-b8f8-fcc9d06e636d" containerName="ssh-known-hosts-edpm-deployment"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.202085 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ed982c-1039-47b4-b8f8-fcc9d06e636d" containerName="ssh-known-hosts-edpm-deployment"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.203361 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.208301 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.209049 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.209101 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.209741 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.215571 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"]
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.348178 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.348396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.348683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj44t\" (UniqueName: \"kubernetes.io/projected/e51631e2-8bb9-4f43-958a-a3475d800d61-kube-api-access-vj44t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.450454 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.450528 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.450591 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj44t\" (UniqueName: \"kubernetes.io/projected/e51631e2-8bb9-4f43-958a-a3475d800d61-kube-api-access-vj44t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.462045 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.464004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.490010 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj44t\" (UniqueName: \"kubernetes.io/projected/e51631e2-8bb9-4f43-958a-a3475d800d61-kube-api-access-vj44t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-s59r6\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:47 crc kubenswrapper[4749]: I1001 13:40:47.543317 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:48 crc kubenswrapper[4749]: I1001 13:40:48.126931 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"]
Oct 01 13:40:49 crc kubenswrapper[4749]: I1001 13:40:49.138816 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6" event={"ID":"e51631e2-8bb9-4f43-958a-a3475d800d61","Type":"ContainerStarted","Data":"01ad7b502ac53ac33d441fdb7797c159e6200cc4836083cbf14c48ad1b00183b"}
Oct 01 13:40:49 crc kubenswrapper[4749]: I1001 13:40:49.139302 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6" event={"ID":"e51631e2-8bb9-4f43-958a-a3475d800d61","Type":"ContainerStarted","Data":"f95b5ed05ce2ba4d9eee76405d83c4f73911bba6c7e3b114c2e124ef4fd74eb1"}
Oct 01 13:40:49 crc kubenswrapper[4749]: I1001 13:40:49.159673 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6" podStartSLOduration=1.7140988579999998 podStartE2EDuration="2.159651693s" podCreationTimestamp="2025-10-01 13:40:47 +0000 UTC" firstStartedPulling="2025-10-01 13:40:48.133730693 +0000 UTC m=+2108.187715602" lastFinishedPulling="2025-10-01 13:40:48.579283538 +0000 UTC m=+2108.633268437" observedRunningTime="2025-10-01 13:40:49.152025534 +0000 UTC m=+2109.206010453" watchObservedRunningTime="2025-10-01 13:40:49.159651693 +0000 UTC m=+2109.213636592"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.650453 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pg4vv"]
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.653634 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.684467 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pg4vv"]
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.770811 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-utilities\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.770884 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsrcb\" (UniqueName: \"kubernetes.io/projected/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-kube-api-access-xsrcb\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.770990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-catalog-content\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.873182 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-catalog-content\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.873515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-utilities\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.873641 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsrcb\" (UniqueName: \"kubernetes.io/projected/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-kube-api-access-xsrcb\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.873981 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-catalog-content\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.874103 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-utilities\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.894726 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsrcb\" (UniqueName: \"kubernetes.io/projected/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-kube-api-access-xsrcb\") pod \"certified-operators-pg4vv\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:56 crc kubenswrapper[4749]: I1001 13:40:56.995125 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pg4vv"
Oct 01 13:40:57 crc kubenswrapper[4749]: I1001 13:40:57.540478 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pg4vv"]
Oct 01 13:40:58 crc kubenswrapper[4749]: I1001 13:40:58.249352 4749 generic.go:334] "Generic (PLEG): container finished" podID="e51631e2-8bb9-4f43-958a-a3475d800d61" containerID="01ad7b502ac53ac33d441fdb7797c159e6200cc4836083cbf14c48ad1b00183b" exitCode=0
Oct 01 13:40:58 crc kubenswrapper[4749]: I1001 13:40:58.249482 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6" event={"ID":"e51631e2-8bb9-4f43-958a-a3475d800d61","Type":"ContainerDied","Data":"01ad7b502ac53ac33d441fdb7797c159e6200cc4836083cbf14c48ad1b00183b"}
Oct 01 13:40:58 crc kubenswrapper[4749]: I1001 13:40:58.252539 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerID="b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826" exitCode=0
Oct 01 13:40:58 crc kubenswrapper[4749]: I1001 13:40:58.252613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg4vv" event={"ID":"8b9e9378-b175-4b01-af8a-4194e5fb7cd0","Type":"ContainerDied","Data":"b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826"}
Oct 01 13:40:58 crc kubenswrapper[4749]: I1001 13:40:58.252659 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg4vv" event={"ID":"8b9e9378-b175-4b01-af8a-4194e5fb7cd0","Type":"ContainerStarted","Data":"ade3bb095d11c63f13afaa0f2b159ea940a2631a4b9dc0a63f2a6658d8121b48"}
Oct 01 13:40:59 crc kubenswrapper[4749]: I1001 13:40:59.269849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg4vv" event={"ID":"8b9e9378-b175-4b01-af8a-4194e5fb7cd0","Type":"ContainerStarted","Data":"6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86"}
Oct 01 13:40:59 crc kubenswrapper[4749]: I1001 13:40:59.783918 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:40:59 crc kubenswrapper[4749]: I1001 13:40:59.957955 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-ssh-key\") pod \"e51631e2-8bb9-4f43-958a-a3475d800d61\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") "
Oct 01 13:40:59 crc kubenswrapper[4749]: I1001 13:40:59.959010 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-inventory\") pod \"e51631e2-8bb9-4f43-958a-a3475d800d61\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") "
Oct 01 13:40:59 crc kubenswrapper[4749]: I1001 13:40:59.959161 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj44t\" (UniqueName: \"kubernetes.io/projected/e51631e2-8bb9-4f43-958a-a3475d800d61-kube-api-access-vj44t\") pod \"e51631e2-8bb9-4f43-958a-a3475d800d61\" (UID: \"e51631e2-8bb9-4f43-958a-a3475d800d61\") "
Oct 01 13:40:59 crc kubenswrapper[4749]: I1001 13:40:59.969663 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51631e2-8bb9-4f43-958a-a3475d800d61-kube-api-access-vj44t" (OuterVolumeSpecName: "kube-api-access-vj44t") pod "e51631e2-8bb9-4f43-958a-a3475d800d61" (UID: "e51631e2-8bb9-4f43-958a-a3475d800d61"). InnerVolumeSpecName "kube-api-access-vj44t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 13:40:59 crc kubenswrapper[4749]: I1001 13:40:59.987415 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-inventory" (OuterVolumeSpecName: "inventory") pod "e51631e2-8bb9-4f43-958a-a3475d800d61" (UID: "e51631e2-8bb9-4f43-958a-a3475d800d61"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:40:59 crc kubenswrapper[4749]: I1001 13:40:59.995696 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e51631e2-8bb9-4f43-958a-a3475d800d61" (UID: "e51631e2-8bb9-4f43-958a-a3475d800d61"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.061824 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.061855 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e51631e2-8bb9-4f43-958a-a3475d800d61-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.061865 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj44t\" (UniqueName: \"kubernetes.io/projected/e51631e2-8bb9-4f43-958a-a3475d800d61-kube-api-access-vj44t\") on node \"crc\" DevicePath \"\""
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.282107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6" event={"ID":"e51631e2-8bb9-4f43-958a-a3475d800d61","Type":"ContainerDied","Data":"f95b5ed05ce2ba4d9eee76405d83c4f73911bba6c7e3b114c2e124ef4fd74eb1"}
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.282151 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f95b5ed05ce2ba4d9eee76405d83c4f73911bba6c7e3b114c2e124ef4fd74eb1"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.282279 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-s59r6"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.285142 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerID="6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86" exitCode=0
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.285207 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg4vv" event={"ID":"8b9e9378-b175-4b01-af8a-4194e5fb7cd0","Type":"ContainerDied","Data":"6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86"}
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.391171 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt"]
Oct 01 13:41:00 crc kubenswrapper[4749]: E1001 13:41:00.391669 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e51631e2-8bb9-4f43-958a-a3475d800d61" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.391690 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51631e2-8bb9-4f43-958a-a3475d800d61" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.391891 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="e51631e2-8bb9-4f43-958a-a3475d800d61" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.392749 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.395136 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.395179 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.395345 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.395430 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.425952 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt"]
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.574424 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsz8j\" (UniqueName: \"kubernetes.io/projected/9b4bd5b0-38c2-416c-aba3-9a0522807502-kube-api-access-dsz8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt"
Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.574804 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: 
\"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.575196 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.677769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.678185 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.678421 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsz8j\" (UniqueName: \"kubernetes.io/projected/9b4bd5b0-38c2-416c-aba3-9a0522807502-kube-api-access-dsz8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.684566 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.685252 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.696620 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsz8j\" (UniqueName: \"kubernetes.io/projected/9b4bd5b0-38c2-416c-aba3-9a0522807502-kube-api-access-dsz8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:00 crc kubenswrapper[4749]: I1001 13:41:00.708574 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:01 crc kubenswrapper[4749]: I1001 13:41:01.295062 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg4vv" event={"ID":"8b9e9378-b175-4b01-af8a-4194e5fb7cd0","Type":"ContainerStarted","Data":"a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34"} Oct 01 13:41:01 crc kubenswrapper[4749]: I1001 13:41:01.309818 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt"] Oct 01 13:41:01 crc kubenswrapper[4749]: W1001 13:41:01.315427 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4bd5b0_38c2_416c_aba3_9a0522807502.slice/crio-b7ebbaddb04380b2e0c496720088b13c26f7a2324a7eec61c14f559651b3bf7c WatchSource:0}: Error finding container b7ebbaddb04380b2e0c496720088b13c26f7a2324a7eec61c14f559651b3bf7c: Status 404 returned error can't find the container with id b7ebbaddb04380b2e0c496720088b13c26f7a2324a7eec61c14f559651b3bf7c Oct 01 13:41:01 crc kubenswrapper[4749]: I1001 13:41:01.319524 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pg4vv" podStartSLOduration=2.589588165 podStartE2EDuration="5.319504944s" podCreationTimestamp="2025-10-01 13:40:56 +0000 UTC" firstStartedPulling="2025-10-01 13:40:58.255540386 +0000 UTC m=+2118.309525325" lastFinishedPulling="2025-10-01 13:41:00.985457185 +0000 UTC m=+2121.039442104" observedRunningTime="2025-10-01 13:41:01.315922341 +0000 UTC m=+2121.369907240" watchObservedRunningTime="2025-10-01 13:41:01.319504944 +0000 UTC m=+2121.373489853" Oct 01 13:41:02 crc kubenswrapper[4749]: I1001 13:41:02.306383 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" 
event={"ID":"9b4bd5b0-38c2-416c-aba3-9a0522807502","Type":"ContainerStarted","Data":"c4cfaebdd5bd14079efdbb3027c8d981bed7e0ba5fe917a8c4f1058ed64fb63f"} Oct 01 13:41:02 crc kubenswrapper[4749]: I1001 13:41:02.306749 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" event={"ID":"9b4bd5b0-38c2-416c-aba3-9a0522807502","Type":"ContainerStarted","Data":"b7ebbaddb04380b2e0c496720088b13c26f7a2324a7eec61c14f559651b3bf7c"} Oct 01 13:41:02 crc kubenswrapper[4749]: I1001 13:41:02.355586 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" podStartSLOduration=1.899203801 podStartE2EDuration="2.355550838s" podCreationTimestamp="2025-10-01 13:41:00 +0000 UTC" firstStartedPulling="2025-10-01 13:41:01.31969628 +0000 UTC m=+2121.373681179" lastFinishedPulling="2025-10-01 13:41:01.776043317 +0000 UTC m=+2121.830028216" observedRunningTime="2025-10-01 13:41:02.331129215 +0000 UTC m=+2122.385114154" watchObservedRunningTime="2025-10-01 13:41:02.355550838 +0000 UTC m=+2122.409535797" Oct 01 13:41:06 crc kubenswrapper[4749]: I1001 13:41:06.996556 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pg4vv" Oct 01 13:41:06 crc kubenswrapper[4749]: I1001 13:41:06.997194 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pg4vv" Oct 01 13:41:07 crc kubenswrapper[4749]: I1001 13:41:07.075395 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pg4vv" Oct 01 13:41:07 crc kubenswrapper[4749]: I1001 13:41:07.442819 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pg4vv" Oct 01 13:41:07 crc kubenswrapper[4749]: I1001 13:41:07.521501 4749 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-pg4vv"] Oct 01 13:41:09 crc kubenswrapper[4749]: I1001 13:41:09.380536 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pg4vv" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerName="registry-server" containerID="cri-o://a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34" gracePeriod=2 Oct 01 13:41:09 crc kubenswrapper[4749]: I1001 13:41:09.976949 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pg4vv" Oct 01 13:41:09 crc kubenswrapper[4749]: I1001 13:41:09.989111 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-utilities\") pod \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " Oct 01 13:41:09 crc kubenswrapper[4749]: I1001 13:41:09.989285 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-catalog-content\") pod \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " Oct 01 13:41:09 crc kubenswrapper[4749]: I1001 13:41:09.989409 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsrcb\" (UniqueName: \"kubernetes.io/projected/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-kube-api-access-xsrcb\") pod \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\" (UID: \"8b9e9378-b175-4b01-af8a-4194e5fb7cd0\") " Oct 01 13:41:09 crc kubenswrapper[4749]: I1001 13:41:09.990536 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-utilities" (OuterVolumeSpecName: "utilities") pod 
"8b9e9378-b175-4b01-af8a-4194e5fb7cd0" (UID: "8b9e9378-b175-4b01-af8a-4194e5fb7cd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:41:09 crc kubenswrapper[4749]: I1001 13:41:09.991666 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.006078 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-kube-api-access-xsrcb" (OuterVolumeSpecName: "kube-api-access-xsrcb") pod "8b9e9378-b175-4b01-af8a-4194e5fb7cd0" (UID: "8b9e9378-b175-4b01-af8a-4194e5fb7cd0"). InnerVolumeSpecName "kube-api-access-xsrcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.094790 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsrcb\" (UniqueName: \"kubernetes.io/projected/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-kube-api-access-xsrcb\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.295540 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b9e9378-b175-4b01-af8a-4194e5fb7cd0" (UID: "8b9e9378-b175-4b01-af8a-4194e5fb7cd0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.298258 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b9e9378-b175-4b01-af8a-4194e5fb7cd0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.392428 4749 generic.go:334] "Generic (PLEG): container finished" podID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerID="a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34" exitCode=0 Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.392514 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pg4vv" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.392540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg4vv" event={"ID":"8b9e9378-b175-4b01-af8a-4194e5fb7cd0","Type":"ContainerDied","Data":"a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34"} Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.395244 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pg4vv" event={"ID":"8b9e9378-b175-4b01-af8a-4194e5fb7cd0","Type":"ContainerDied","Data":"ade3bb095d11c63f13afaa0f2b159ea940a2631a4b9dc0a63f2a6658d8121b48"} Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.395321 4749 scope.go:117] "RemoveContainer" containerID="a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.446602 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pg4vv"] Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.447757 4749 scope.go:117] "RemoveContainer" containerID="6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86" Oct 01 13:41:10 crc kubenswrapper[4749]: 
I1001 13:41:10.455492 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pg4vv"] Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.475790 4749 scope.go:117] "RemoveContainer" containerID="b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.519201 4749 scope.go:117] "RemoveContainer" containerID="a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34" Oct 01 13:41:10 crc kubenswrapper[4749]: E1001 13:41:10.520422 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34\": container with ID starting with a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34 not found: ID does not exist" containerID="a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.520474 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34"} err="failed to get container status \"a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34\": rpc error: code = NotFound desc = could not find container \"a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34\": container with ID starting with a5694aaed820532dbea71af38379db1457ef546c3b370ec38db67f4bca824f34 not found: ID does not exist" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.520505 4749 scope.go:117] "RemoveContainer" containerID="6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86" Oct 01 13:41:10 crc kubenswrapper[4749]: E1001 13:41:10.521029 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86\": container 
with ID starting with 6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86 not found: ID does not exist" containerID="6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.521059 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86"} err="failed to get container status \"6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86\": rpc error: code = NotFound desc = could not find container \"6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86\": container with ID starting with 6f1c72483a13a6e9749b27b937315ce65da13ea00906cf4be660a155bfab2f86 not found: ID does not exist" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.521078 4749 scope.go:117] "RemoveContainer" containerID="b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826" Oct 01 13:41:10 crc kubenswrapper[4749]: E1001 13:41:10.522290 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826\": container with ID starting with b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826 not found: ID does not exist" containerID="b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826" Oct 01 13:41:10 crc kubenswrapper[4749]: I1001 13:41:10.522338 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826"} err="failed to get container status \"b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826\": rpc error: code = NotFound desc = could not find container \"b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826\": container with ID starting with b9a12b956b7a7cd54018a51702248f47221452f28e5d521f0ca161e451ec6826 not 
found: ID does not exist" Oct 01 13:41:11 crc kubenswrapper[4749]: I1001 13:41:11.247660 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" path="/var/lib/kubelet/pods/8b9e9378-b175-4b01-af8a-4194e5fb7cd0/volumes" Oct 01 13:41:12 crc kubenswrapper[4749]: I1001 13:41:12.425075 4749 generic.go:334] "Generic (PLEG): container finished" podID="9b4bd5b0-38c2-416c-aba3-9a0522807502" containerID="c4cfaebdd5bd14079efdbb3027c8d981bed7e0ba5fe917a8c4f1058ed64fb63f" exitCode=0 Oct 01 13:41:12 crc kubenswrapper[4749]: I1001 13:41:12.425125 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" event={"ID":"9b4bd5b0-38c2-416c-aba3-9a0522807502","Type":"ContainerDied","Data":"c4cfaebdd5bd14079efdbb3027c8d981bed7e0ba5fe917a8c4f1058ed64fb63f"} Oct 01 13:41:13 crc kubenswrapper[4749]: I1001 13:41:13.903376 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.073653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsz8j\" (UniqueName: \"kubernetes.io/projected/9b4bd5b0-38c2-416c-aba3-9a0522807502-kube-api-access-dsz8j\") pod \"9b4bd5b0-38c2-416c-aba3-9a0522807502\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.073826 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-ssh-key\") pod \"9b4bd5b0-38c2-416c-aba3-9a0522807502\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.073932 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-inventory\") pod \"9b4bd5b0-38c2-416c-aba3-9a0522807502\" (UID: \"9b4bd5b0-38c2-416c-aba3-9a0522807502\") " Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.079513 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4bd5b0-38c2-416c-aba3-9a0522807502-kube-api-access-dsz8j" (OuterVolumeSpecName: "kube-api-access-dsz8j") pod "9b4bd5b0-38c2-416c-aba3-9a0522807502" (UID: "9b4bd5b0-38c2-416c-aba3-9a0522807502"). InnerVolumeSpecName "kube-api-access-dsz8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.100596 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-inventory" (OuterVolumeSpecName: "inventory") pod "9b4bd5b0-38c2-416c-aba3-9a0522807502" (UID: "9b4bd5b0-38c2-416c-aba3-9a0522807502"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.126405 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b4bd5b0-38c2-416c-aba3-9a0522807502" (UID: "9b4bd5b0-38c2-416c-aba3-9a0522807502"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.175901 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.175935 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b4bd5b0-38c2-416c-aba3-9a0522807502-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.175946 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsz8j\" (UniqueName: \"kubernetes.io/projected/9b4bd5b0-38c2-416c-aba3-9a0522807502-kube-api-access-dsz8j\") on node \"crc\" DevicePath \"\"" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.448349 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" event={"ID":"9b4bd5b0-38c2-416c-aba3-9a0522807502","Type":"ContainerDied","Data":"b7ebbaddb04380b2e0c496720088b13c26f7a2324a7eec61c14f559651b3bf7c"} Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.448409 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7ebbaddb04380b2e0c496720088b13c26f7a2324a7eec61c14f559651b3bf7c" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.448490 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.573624 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8"] Oct 01 13:41:14 crc kubenswrapper[4749]: E1001 13:41:14.574309 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerName="extract-utilities" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.574335 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerName="extract-utilities" Oct 01 13:41:14 crc kubenswrapper[4749]: E1001 13:41:14.574366 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerName="extract-content" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.574376 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerName="extract-content" Oct 01 13:41:14 crc kubenswrapper[4749]: E1001 13:41:14.574395 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerName="registry-server" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.574405 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerName="registry-server" Oct 01 13:41:14 crc kubenswrapper[4749]: E1001 13:41:14.574429 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4bd5b0-38c2-416c-aba3-9a0522807502" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.574442 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4bd5b0-38c2-416c-aba3-9a0522807502" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.574744 
4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4bd5b0-38c2-416c-aba3-9a0522807502" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.574799 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9e9378-b175-4b01-af8a-4194e5fb7cd0" containerName="registry-server" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.575993 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.577805 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.579890 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.580082 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.580250 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.580440 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.580535 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.580592 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.581565 4749 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585000 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585087 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585384 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585491 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585589 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585806 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585899 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.585983 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9gdz\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-kube-api-access-l9gdz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.586087 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.586192 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.586244 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8"] Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.586351 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.586430 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.586517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.687773 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.687814 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.687866 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9gdz\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-kube-api-access-l9gdz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.687895 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.687953 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688026 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688058 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688145 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688206 
4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688250 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688270 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.688295 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.692559 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.693629 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.694035 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.695086 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.695133 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.695997 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.696084 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.696334 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.698324 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.698496 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.699563 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.700890 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.707765 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.707800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9gdz\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-kube-api-access-l9gdz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:14 crc kubenswrapper[4749]: I1001 13:41:14.927554 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:41:15 crc kubenswrapper[4749]: I1001 13:41:15.494851 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8"] Oct 01 13:41:16 crc kubenswrapper[4749]: I1001 13:41:16.469075 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" event={"ID":"c51108bd-9132-43bf-ac9b-61a8284dc289","Type":"ContainerStarted","Data":"0e3444da70dccbee05b1a65cb693019ada70449f1b40b50a5b31ffa6759c0fa6"} Oct 01 13:41:16 crc kubenswrapper[4749]: I1001 13:41:16.469524 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" event={"ID":"c51108bd-9132-43bf-ac9b-61a8284dc289","Type":"ContainerStarted","Data":"160041932a23c14f80bfa11c0f00f0587c45992388fd3bd9d593f206ca6a742e"} Oct 01 13:41:16 crc kubenswrapper[4749]: I1001 13:41:16.496390 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" podStartSLOduration=1.9624215820000002 podStartE2EDuration="2.496365402s" podCreationTimestamp="2025-10-01 13:41:14 +0000 UTC" firstStartedPulling="2025-10-01 13:41:15.5052307 +0000 
UTC m=+2135.559215599" lastFinishedPulling="2025-10-01 13:41:16.03917449 +0000 UTC m=+2136.093159419" observedRunningTime="2025-10-01 13:41:16.490428201 +0000 UTC m=+2136.544413110" watchObservedRunningTime="2025-10-01 13:41:16.496365402 +0000 UTC m=+2136.550350331" Oct 01 13:41:59 crc kubenswrapper[4749]: I1001 13:41:59.957810 4749 generic.go:334] "Generic (PLEG): container finished" podID="c51108bd-9132-43bf-ac9b-61a8284dc289" containerID="0e3444da70dccbee05b1a65cb693019ada70449f1b40b50a5b31ffa6759c0fa6" exitCode=0 Oct 01 13:41:59 crc kubenswrapper[4749]: I1001 13:41:59.957874 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" event={"ID":"c51108bd-9132-43bf-ac9b-61a8284dc289","Type":"ContainerDied","Data":"0e3444da70dccbee05b1a65cb693019ada70449f1b40b50a5b31ffa6759c0fa6"} Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.362076 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419302 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419363 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-neutron-metadata-combined-ca-bundle\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419428 4749 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-nova-combined-ca-bundle\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419458 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-ovn-default-certs-0\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419481 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-libvirt-combined-ca-bundle\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419519 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ssh-key\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419577 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-inventory\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419598 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-repo-setup-combined-ca-bundle\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419630 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419670 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-bootstrap-combined-ca-bundle\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419717 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ovn-combined-ca-bundle\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419784 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-telemetry-combined-ca-bundle\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419822 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9gdz\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-kube-api-access-l9gdz\") 
pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.419942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"c51108bd-9132-43bf-ac9b-61a8284dc289\" (UID: \"c51108bd-9132-43bf-ac9b-61a8284dc289\") " Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.426421 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.427090 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.429090 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.430210 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.430363 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.432612 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.432742 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.433378 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.434946 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.435048 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.435906 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-kube-api-access-l9gdz" (OuterVolumeSpecName: "kube-api-access-l9gdz") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "kube-api-access-l9gdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.436467 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.459826 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-inventory" (OuterVolumeSpecName: "inventory") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.461960 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c51108bd-9132-43bf-ac9b-61a8284dc289" (UID: "c51108bd-9132-43bf-ac9b-61a8284dc289"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522532 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522569 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522581 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522594 4749 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522605 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522619 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522630 4749 
reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522639 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522648 4749 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522658 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522671 4749 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522681 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522689 4749 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c51108bd-9132-43bf-ac9b-61a8284dc289-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.522698 4749 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9gdz\" (UniqueName: \"kubernetes.io/projected/c51108bd-9132-43bf-ac9b-61a8284dc289-kube-api-access-l9gdz\") on node \"crc\" DevicePath \"\"" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.979829 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" event={"ID":"c51108bd-9132-43bf-ac9b-61a8284dc289","Type":"ContainerDied","Data":"160041932a23c14f80bfa11c0f00f0587c45992388fd3bd9d593f206ca6a742e"} Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.979866 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="160041932a23c14f80bfa11c0f00f0587c45992388fd3bd9d593f206ca6a742e" Oct 01 13:42:01 crc kubenswrapper[4749]: I1001 13:42:01.979941 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.107008 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.107360 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.190793 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd"] Oct 01 13:42:02 crc kubenswrapper[4749]: E1001 13:42:02.191344 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c51108bd-9132-43bf-ac9b-61a8284dc289" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.191367 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c51108bd-9132-43bf-ac9b-61a8284dc289" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.191645 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c51108bd-9132-43bf-ac9b-61a8284dc289" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.192598 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.195205 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.195524 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.195911 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.196378 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.203484 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd"] Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.204706 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.342678 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plkb2\" (UniqueName: \"kubernetes.io/projected/4b848b5a-f3c5-438c-a481-f06d07d4273a-kube-api-access-plkb2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.343088 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.343250 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.343271 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.343316 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ssh-key\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.447309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.447385 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.447441 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.447666 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plkb2\" (UniqueName: \"kubernetes.io/projected/4b848b5a-f3c5-438c-a481-f06d07d4273a-kube-api-access-plkb2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.447829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.449920 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.454714 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.456410 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.460076 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: 
I1001 13:42:02.476101 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plkb2\" (UniqueName: \"kubernetes.io/projected/4b848b5a-f3c5-438c-a481-f06d07d4273a-kube-api-access-plkb2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-x8tkd\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:02 crc kubenswrapper[4749]: I1001 13:42:02.513714 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:42:03 crc kubenswrapper[4749]: I1001 13:42:03.141593 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd"] Oct 01 13:42:04 crc kubenswrapper[4749]: I1001 13:42:04.002241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" event={"ID":"4b848b5a-f3c5-438c-a481-f06d07d4273a","Type":"ContainerStarted","Data":"ac15c751774307bb1b71cfcd6188cb64ec0939bfaf5ee2549880b9ff3fb80e07"} Oct 01 13:42:05 crc kubenswrapper[4749]: I1001 13:42:05.013629 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" event={"ID":"4b848b5a-f3c5-438c-a481-f06d07d4273a","Type":"ContainerStarted","Data":"4eedc43331887b1325768637b4c31ca18e7e5355f3d5af57d887225d2a693321"} Oct 01 13:42:05 crc kubenswrapper[4749]: I1001 13:42:05.048716 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" podStartSLOduration=2.448242096 podStartE2EDuration="3.048690877s" podCreationTimestamp="2025-10-01 13:42:02 +0000 UTC" firstStartedPulling="2025-10-01 13:42:03.148518647 +0000 UTC m=+2183.202503546" lastFinishedPulling="2025-10-01 13:42:03.748967418 +0000 UTC m=+2183.802952327" observedRunningTime="2025-10-01 13:42:05.034099527 +0000 UTC m=+2185.088084436" 
watchObservedRunningTime="2025-10-01 13:42:05.048690877 +0000 UTC m=+2185.102675816" Oct 01 13:42:32 crc kubenswrapper[4749]: I1001 13:42:32.107347 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:42:32 crc kubenswrapper[4749]: I1001 13:42:32.108185 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.106672 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.107307 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.107369 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.108254 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.108325 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" gracePeriod=600 Oct 01 13:43:02 crc kubenswrapper[4749]: E1001 13:43:02.230673 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.592610 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" exitCode=0 Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.592683 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f"} Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.592944 4749 scope.go:117] "RemoveContainer" containerID="b8204bdc15b502545a9526a4a541f27a52efd45ad6646c2c13cfdd5b53e3e274" Oct 01 13:43:02 crc kubenswrapper[4749]: I1001 13:43:02.593753 4749 
scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:43:02 crc kubenswrapper[4749]: E1001 13:43:02.594109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:43:14 crc kubenswrapper[4749]: I1001 13:43:14.720644 4749 generic.go:334] "Generic (PLEG): container finished" podID="4b848b5a-f3c5-438c-a481-f06d07d4273a" containerID="4eedc43331887b1325768637b4c31ca18e7e5355f3d5af57d887225d2a693321" exitCode=0 Oct 01 13:43:14 crc kubenswrapper[4749]: I1001 13:43:14.720778 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" event={"ID":"4b848b5a-f3c5-438c-a481-f06d07d4273a","Type":"ContainerDied","Data":"4eedc43331887b1325768637b4c31ca18e7e5355f3d5af57d887225d2a693321"} Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.250632 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.374479 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plkb2\" (UniqueName: \"kubernetes.io/projected/4b848b5a-f3c5-438c-a481-f06d07d4273a-kube-api-access-plkb2\") pod \"4b848b5a-f3c5-438c-a481-f06d07d4273a\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.375173 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ssh-key\") pod \"4b848b5a-f3c5-438c-a481-f06d07d4273a\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.375272 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-inventory\") pod \"4b848b5a-f3c5-438c-a481-f06d07d4273a\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.375341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovn-combined-ca-bundle\") pod \"4b848b5a-f3c5-438c-a481-f06d07d4273a\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.375440 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovncontroller-config-0\") pod \"4b848b5a-f3c5-438c-a481-f06d07d4273a\" (UID: \"4b848b5a-f3c5-438c-a481-f06d07d4273a\") " Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.381305 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4b848b5a-f3c5-438c-a481-f06d07d4273a" (UID: "4b848b5a-f3c5-438c-a481-f06d07d4273a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.382004 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b848b5a-f3c5-438c-a481-f06d07d4273a-kube-api-access-plkb2" (OuterVolumeSpecName: "kube-api-access-plkb2") pod "4b848b5a-f3c5-438c-a481-f06d07d4273a" (UID: "4b848b5a-f3c5-438c-a481-f06d07d4273a"). InnerVolumeSpecName "kube-api-access-plkb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.414762 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-inventory" (OuterVolumeSpecName: "inventory") pod "4b848b5a-f3c5-438c-a481-f06d07d4273a" (UID: "4b848b5a-f3c5-438c-a481-f06d07d4273a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.424422 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4b848b5a-f3c5-438c-a481-f06d07d4273a" (UID: "4b848b5a-f3c5-438c-a481-f06d07d4273a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.432820 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "4b848b5a-f3c5-438c-a481-f06d07d4273a" (UID: "4b848b5a-f3c5-438c-a481-f06d07d4273a"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.479130 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plkb2\" (UniqueName: \"kubernetes.io/projected/4b848b5a-f3c5-438c-a481-f06d07d4273a-kube-api-access-plkb2\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.479499 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.479638 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.479748 4749 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.479873 4749 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/4b848b5a-f3c5-438c-a481-f06d07d4273a-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.744500 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" event={"ID":"4b848b5a-f3c5-438c-a481-f06d07d4273a","Type":"ContainerDied","Data":"ac15c751774307bb1b71cfcd6188cb64ec0939bfaf5ee2549880b9ff3fb80e07"} Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.744821 4749 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ac15c751774307bb1b71cfcd6188cb64ec0939bfaf5ee2549880b9ff3fb80e07" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.744558 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-x8tkd" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.855002 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528"] Oct 01 13:43:16 crc kubenswrapper[4749]: E1001 13:43:16.855569 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b848b5a-f3c5-438c-a481-f06d07d4273a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.855592 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b848b5a-f3c5-438c-a481-f06d07d4273a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.855796 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b848b5a-f3c5-438c-a481-f06d07d4273a" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.856718 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.859865 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.859974 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.860041 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.860532 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.860730 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.860888 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.869566 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528"] Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.989612 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.990140 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.990389 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.990568 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq7fs\" (UniqueName: \"kubernetes.io/projected/9f01f729-fe3b-4f70-89c9-4398f80160e7-kube-api-access-cq7fs\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.990699 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:16 crc kubenswrapper[4749]: I1001 13:43:16.991309 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.111240 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.111294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.111340 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq7fs\" (UniqueName: \"kubernetes.io/projected/9f01f729-fe3b-4f70-89c9-4398f80160e7-kube-api-access-cq7fs\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.111370 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.111437 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.111484 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.119263 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.120424 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: 
\"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.120587 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.122566 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.122607 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.131769 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq7fs\" (UniqueName: \"kubernetes.io/projected/9f01f729-fe3b-4f70-89c9-4398f80160e7-kube-api-access-cq7fs\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc 
kubenswrapper[4749]: I1001 13:43:17.187538 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.231673 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:43:17 crc kubenswrapper[4749]: E1001 13:43:17.232358 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.801721 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528"] Oct 01 13:43:17 crc kubenswrapper[4749]: I1001 13:43:17.808636 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:43:18 crc kubenswrapper[4749]: I1001 13:43:18.768763 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" event={"ID":"9f01f729-fe3b-4f70-89c9-4398f80160e7","Type":"ContainerStarted","Data":"4b332e9f3645e694f40bc8fedd6f965395f1b061dcb689a32a30cd006e7a381f"} Oct 01 13:43:19 crc kubenswrapper[4749]: I1001 13:43:19.783672 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" event={"ID":"9f01f729-fe3b-4f70-89c9-4398f80160e7","Type":"ContainerStarted","Data":"a1cba802284992fb7f4e6cabb1666f4c69ffe3375bef5d263bf307bfb525a085"} Oct 01 13:43:19 crc kubenswrapper[4749]: I1001 13:43:19.823287 4749 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" podStartSLOduration=3.050796897 podStartE2EDuration="3.823263188s" podCreationTimestamp="2025-10-01 13:43:16 +0000 UTC" firstStartedPulling="2025-10-01 13:43:17.808350059 +0000 UTC m=+2257.862334968" lastFinishedPulling="2025-10-01 13:43:18.58081633 +0000 UTC m=+2258.634801259" observedRunningTime="2025-10-01 13:43:19.814400232 +0000 UTC m=+2259.868385161" watchObservedRunningTime="2025-10-01 13:43:19.823263188 +0000 UTC m=+2259.877248127" Oct 01 13:43:29 crc kubenswrapper[4749]: I1001 13:43:29.230837 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:43:29 crc kubenswrapper[4749]: E1001 13:43:29.231922 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:43:43 crc kubenswrapper[4749]: I1001 13:43:43.230526 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:43:43 crc kubenswrapper[4749]: E1001 13:43:43.231556 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:43:57 crc kubenswrapper[4749]: I1001 13:43:57.230049 4749 
scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:43:57 crc kubenswrapper[4749]: E1001 13:43:57.230953 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:44:12 crc kubenswrapper[4749]: I1001 13:44:12.230893 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:44:12 crc kubenswrapper[4749]: E1001 13:44:12.232303 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:44:13 crc kubenswrapper[4749]: I1001 13:44:13.361093 4749 generic.go:334] "Generic (PLEG): container finished" podID="9f01f729-fe3b-4f70-89c9-4398f80160e7" containerID="a1cba802284992fb7f4e6cabb1666f4c69ffe3375bef5d263bf307bfb525a085" exitCode=0 Oct 01 13:44:13 crc kubenswrapper[4749]: I1001 13:44:13.361240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" event={"ID":"9f01f729-fe3b-4f70-89c9-4398f80160e7","Type":"ContainerDied","Data":"a1cba802284992fb7f4e6cabb1666f4c69ffe3375bef5d263bf307bfb525a085"} Oct 01 13:44:14 crc kubenswrapper[4749]: I1001 13:44:14.852997 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.017379 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-nova-metadata-neutron-config-0\") pod \"9f01f729-fe3b-4f70-89c9-4398f80160e7\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.017478 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-inventory\") pod \"9f01f729-fe3b-4f70-89c9-4398f80160e7\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.017542 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-ssh-key\") pod \"9f01f729-fe3b-4f70-89c9-4398f80160e7\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.017610 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq7fs\" (UniqueName: \"kubernetes.io/projected/9f01f729-fe3b-4f70-89c9-4398f80160e7-kube-api-access-cq7fs\") pod \"9f01f729-fe3b-4f70-89c9-4398f80160e7\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.017649 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9f01f729-fe3b-4f70-89c9-4398f80160e7\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 
13:44:15.017738 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-metadata-combined-ca-bundle\") pod \"9f01f729-fe3b-4f70-89c9-4398f80160e7\" (UID: \"9f01f729-fe3b-4f70-89c9-4398f80160e7\") " Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.023873 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f01f729-fe3b-4f70-89c9-4398f80160e7-kube-api-access-cq7fs" (OuterVolumeSpecName: "kube-api-access-cq7fs") pod "9f01f729-fe3b-4f70-89c9-4398f80160e7" (UID: "9f01f729-fe3b-4f70-89c9-4398f80160e7"). InnerVolumeSpecName "kube-api-access-cq7fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.024868 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9f01f729-fe3b-4f70-89c9-4398f80160e7" (UID: "9f01f729-fe3b-4f70-89c9-4398f80160e7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.049743 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-inventory" (OuterVolumeSpecName: "inventory") pod "9f01f729-fe3b-4f70-89c9-4398f80160e7" (UID: "9f01f729-fe3b-4f70-89c9-4398f80160e7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.052530 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9f01f729-fe3b-4f70-89c9-4398f80160e7" (UID: "9f01f729-fe3b-4f70-89c9-4398f80160e7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.057206 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9f01f729-fe3b-4f70-89c9-4398f80160e7" (UID: "9f01f729-fe3b-4f70-89c9-4398f80160e7"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.081164 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9f01f729-fe3b-4f70-89c9-4398f80160e7" (UID: "9f01f729-fe3b-4f70-89c9-4398f80160e7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.120410 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.120463 4749 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.120569 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.120590 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.120608 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq7fs\" (UniqueName: \"kubernetes.io/projected/9f01f729-fe3b-4f70-89c9-4398f80160e7-kube-api-access-cq7fs\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.120625 4749 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9f01f729-fe3b-4f70-89c9-4398f80160e7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.385209 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" 
event={"ID":"9f01f729-fe3b-4f70-89c9-4398f80160e7","Type":"ContainerDied","Data":"4b332e9f3645e694f40bc8fedd6f965395f1b061dcb689a32a30cd006e7a381f"} Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.385263 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b332e9f3645e694f40bc8fedd6f965395f1b061dcb689a32a30cd006e7a381f" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.385292 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.485696 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6"] Oct 01 13:44:15 crc kubenswrapper[4749]: E1001 13:44:15.486100 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f01f729-fe3b-4f70-89c9-4398f80160e7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.486116 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f01f729-fe3b-4f70-89c9-4398f80160e7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.486373 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f01f729-fe3b-4f70-89c9-4398f80160e7" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.487021 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.492821 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.493047 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.493274 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.493429 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.493647 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.500622 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6"] Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.529441 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r2vv\" (UniqueName: \"kubernetes.io/projected/f2871c6b-b170-4396-8c0b-be0ac02c1b48-kube-api-access-4r2vv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.529870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.529917 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.529957 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.530022 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.632258 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r2vv\" (UniqueName: \"kubernetes.io/projected/f2871c6b-b170-4396-8c0b-be0ac02c1b48-kube-api-access-4r2vv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.632463 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.632496 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.632524 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.632565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.636373 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.637036 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.637198 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.637435 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.649047 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r2vv\" (UniqueName: \"kubernetes.io/projected/f2871c6b-b170-4396-8c0b-be0ac02c1b48-kube-api-access-4r2vv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bksd6\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:15 crc kubenswrapper[4749]: I1001 13:44:15.803048 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:44:16 crc kubenswrapper[4749]: I1001 13:44:16.410970 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6"] Oct 01 13:44:17 crc kubenswrapper[4749]: I1001 13:44:17.409031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" event={"ID":"f2871c6b-b170-4396-8c0b-be0ac02c1b48","Type":"ContainerStarted","Data":"a3a2d39a82a5834d8300d97d2195aa798622024b6f37ded91c553646297e443c"} Oct 01 13:44:17 crc kubenswrapper[4749]: I1001 13:44:17.409081 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" event={"ID":"f2871c6b-b170-4396-8c0b-be0ac02c1b48","Type":"ContainerStarted","Data":"15081485705df72e136b107468a18132a6e16f050f611c64701e5551d0e58518"} Oct 01 13:44:17 crc kubenswrapper[4749]: I1001 13:44:17.458490 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" podStartSLOduration=1.9494970249999999 podStartE2EDuration="2.458459244s" podCreationTimestamp="2025-10-01 13:44:15 +0000 UTC" firstStartedPulling="2025-10-01 13:44:16.422199348 +0000 UTC m=+2316.476184267" lastFinishedPulling="2025-10-01 13:44:16.931161547 +0000 UTC m=+2316.985146486" observedRunningTime="2025-10-01 13:44:17.448774254 +0000 UTC m=+2317.502759163" watchObservedRunningTime="2025-10-01 13:44:17.458459244 +0000 UTC m=+2317.512444173" Oct 01 13:44:26 crc kubenswrapper[4749]: I1001 13:44:26.230569 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:44:26 crc kubenswrapper[4749]: E1001 13:44:26.231399 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:44:36 crc kubenswrapper[4749]: I1001 13:44:36.901988 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bjwzm"] Oct 01 13:44:36 crc kubenswrapper[4749]: I1001 13:44:36.906491 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:36 crc kubenswrapper[4749]: I1001 13:44:36.940224 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjwzm"] Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.076181 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4csw\" (UniqueName: \"kubernetes.io/projected/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-kube-api-access-l4csw\") pod \"community-operators-bjwzm\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.076407 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-utilities\") pod \"community-operators-bjwzm\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.076495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-catalog-content\") pod \"community-operators-bjwzm\" (UID: 
\"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.091494 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n58zm"] Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.093964 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.116266 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n58zm"] Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.178539 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4csw\" (UniqueName: \"kubernetes.io/projected/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-kube-api-access-l4csw\") pod \"community-operators-bjwzm\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.178664 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-utilities\") pod \"community-operators-bjwzm\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.178728 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-catalog-content\") pod \"community-operators-bjwzm\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.179291 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-catalog-content\") pod \"community-operators-bjwzm\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.179353 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-utilities\") pod \"community-operators-bjwzm\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.205328 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4csw\" (UniqueName: \"kubernetes.io/projected/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-kube-api-access-l4csw\") pod \"community-operators-bjwzm\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.232885 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.289921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79725\" (UniqueName: \"kubernetes.io/projected/c0d6ca94-68a9-4233-90a0-83998c687856-kube-api-access-79725\") pod \"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.290107 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-catalog-content\") pod \"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.290188 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-utilities\") pod \"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.391531 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-utilities\") pod \"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.392119 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79725\" (UniqueName: \"kubernetes.io/projected/c0d6ca94-68a9-4233-90a0-83998c687856-kube-api-access-79725\") pod 
\"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.392031 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-utilities\") pod \"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.392419 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-catalog-content\") pod \"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.392652 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-catalog-content\") pod \"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.413180 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79725\" (UniqueName: \"kubernetes.io/projected/c0d6ca94-68a9-4233-90a0-83998c687856-kube-api-access-79725\") pod \"redhat-marketplace-n58zm\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.417884 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:37 crc kubenswrapper[4749]: I1001 13:44:37.911066 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bjwzm"] Oct 01 13:44:37 crc kubenswrapper[4749]: W1001 13:44:37.917431 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeca929ee_fc6d_4197_a1cd_acb05e10a6ba.slice/crio-4e49fa412762275961024c64edd4c69e424f4b0259b20ff073f08c44a7c93f57 WatchSource:0}: Error finding container 4e49fa412762275961024c64edd4c69e424f4b0259b20ff073f08c44a7c93f57: Status 404 returned error can't find the container with id 4e49fa412762275961024c64edd4c69e424f4b0259b20ff073f08c44a7c93f57 Oct 01 13:44:38 crc kubenswrapper[4749]: I1001 13:44:38.017761 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n58zm"] Oct 01 13:44:38 crc kubenswrapper[4749]: I1001 13:44:38.621106 4749 generic.go:334] "Generic (PLEG): container finished" podID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerID="6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471" exitCode=0 Oct 01 13:44:38 crc kubenswrapper[4749]: I1001 13:44:38.621331 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjwzm" event={"ID":"eca929ee-fc6d-4197-a1cd-acb05e10a6ba","Type":"ContainerDied","Data":"6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471"} Oct 01 13:44:38 crc kubenswrapper[4749]: I1001 13:44:38.621540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjwzm" event={"ID":"eca929ee-fc6d-4197-a1cd-acb05e10a6ba","Type":"ContainerStarted","Data":"4e49fa412762275961024c64edd4c69e424f4b0259b20ff073f08c44a7c93f57"} Oct 01 13:44:38 crc kubenswrapper[4749]: I1001 13:44:38.623354 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="c0d6ca94-68a9-4233-90a0-83998c687856" containerID="78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d" exitCode=0 Oct 01 13:44:38 crc kubenswrapper[4749]: I1001 13:44:38.623388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n58zm" event={"ID":"c0d6ca94-68a9-4233-90a0-83998c687856","Type":"ContainerDied","Data":"78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d"} Oct 01 13:44:38 crc kubenswrapper[4749]: I1001 13:44:38.623416 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n58zm" event={"ID":"c0d6ca94-68a9-4233-90a0-83998c687856","Type":"ContainerStarted","Data":"05d1f1f22f1b0ab910e10125c3a8a1e9ead947d2b4e4281b169fe6be72a3e5f0"} Oct 01 13:44:39 crc kubenswrapper[4749]: I1001 13:44:39.230536 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:44:39 crc kubenswrapper[4749]: E1001 13:44:39.230781 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:44:40 crc kubenswrapper[4749]: I1001 13:44:40.648180 4749 generic.go:334] "Generic (PLEG): container finished" podID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerID="ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783" exitCode=0 Oct 01 13:44:40 crc kubenswrapper[4749]: I1001 13:44:40.648779 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjwzm" 
event={"ID":"eca929ee-fc6d-4197-a1cd-acb05e10a6ba","Type":"ContainerDied","Data":"ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783"} Oct 01 13:44:40 crc kubenswrapper[4749]: I1001 13:44:40.653917 4749 generic.go:334] "Generic (PLEG): container finished" podID="c0d6ca94-68a9-4233-90a0-83998c687856" containerID="c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf" exitCode=0 Oct 01 13:44:40 crc kubenswrapper[4749]: I1001 13:44:40.653952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n58zm" event={"ID":"c0d6ca94-68a9-4233-90a0-83998c687856","Type":"ContainerDied","Data":"c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf"} Oct 01 13:44:41 crc kubenswrapper[4749]: I1001 13:44:41.665366 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n58zm" event={"ID":"c0d6ca94-68a9-4233-90a0-83998c687856","Type":"ContainerStarted","Data":"d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219"} Oct 01 13:44:41 crc kubenswrapper[4749]: I1001 13:44:41.667296 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjwzm" event={"ID":"eca929ee-fc6d-4197-a1cd-acb05e10a6ba","Type":"ContainerStarted","Data":"a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b"} Oct 01 13:44:41 crc kubenswrapper[4749]: I1001 13:44:41.688904 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n58zm" podStartSLOduration=2.133885636 podStartE2EDuration="4.688869941s" podCreationTimestamp="2025-10-01 13:44:37 +0000 UTC" firstStartedPulling="2025-10-01 13:44:38.625056651 +0000 UTC m=+2338.679041540" lastFinishedPulling="2025-10-01 13:44:41.180040906 +0000 UTC m=+2341.234025845" observedRunningTime="2025-10-01 13:44:41.680325975 +0000 UTC m=+2341.734310884" watchObservedRunningTime="2025-10-01 13:44:41.688869941 +0000 UTC 
m=+2341.742854850" Oct 01 13:44:41 crc kubenswrapper[4749]: I1001 13:44:41.705352 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bjwzm" podStartSLOduration=3.140621779 podStartE2EDuration="5.705327636s" podCreationTimestamp="2025-10-01 13:44:36 +0000 UTC" firstStartedPulling="2025-10-01 13:44:38.623794724 +0000 UTC m=+2338.677779623" lastFinishedPulling="2025-10-01 13:44:41.188500541 +0000 UTC m=+2341.242485480" observedRunningTime="2025-10-01 13:44:41.700151787 +0000 UTC m=+2341.754136706" watchObservedRunningTime="2025-10-01 13:44:41.705327636 +0000 UTC m=+2341.759312545" Oct 01 13:44:47 crc kubenswrapper[4749]: I1001 13:44:47.244680 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:47 crc kubenswrapper[4749]: I1001 13:44:47.245327 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:47 crc kubenswrapper[4749]: I1001 13:44:47.286974 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:47 crc kubenswrapper[4749]: I1001 13:44:47.418925 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:47 crc kubenswrapper[4749]: I1001 13:44:47.419305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:47 crc kubenswrapper[4749]: I1001 13:44:47.462634 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:47 crc kubenswrapper[4749]: I1001 13:44:47.784897 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:47 
crc kubenswrapper[4749]: I1001 13:44:47.792393 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:49 crc kubenswrapper[4749]: I1001 13:44:49.136611 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bjwzm"] Oct 01 13:44:49 crc kubenswrapper[4749]: I1001 13:44:49.754183 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bjwzm" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerName="registry-server" containerID="cri-o://a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b" gracePeriod=2 Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.125201 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n58zm"] Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.224106 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.417994 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-utilities\") pod \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.418287 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-catalog-content\") pod \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.418374 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4csw\" (UniqueName: \"kubernetes.io/projected/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-kube-api-access-l4csw\") pod \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\" (UID: \"eca929ee-fc6d-4197-a1cd-acb05e10a6ba\") " Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.419858 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-utilities" (OuterVolumeSpecName: "utilities") pod "eca929ee-fc6d-4197-a1cd-acb05e10a6ba" (UID: "eca929ee-fc6d-4197-a1cd-acb05e10a6ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.424626 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-kube-api-access-l4csw" (OuterVolumeSpecName: "kube-api-access-l4csw") pod "eca929ee-fc6d-4197-a1cd-acb05e10a6ba" (UID: "eca929ee-fc6d-4197-a1cd-acb05e10a6ba"). InnerVolumeSpecName "kube-api-access-l4csw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.504844 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eca929ee-fc6d-4197-a1cd-acb05e10a6ba" (UID: "eca929ee-fc6d-4197-a1cd-acb05e10a6ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.521498 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.521531 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.521545 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4csw\" (UniqueName: \"kubernetes.io/projected/eca929ee-fc6d-4197-a1cd-acb05e10a6ba-kube-api-access-l4csw\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.769655 4749 generic.go:334] "Generic (PLEG): container finished" podID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerID="a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b" exitCode=0 Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.769711 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bjwzm" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.769780 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjwzm" event={"ID":"eca929ee-fc6d-4197-a1cd-acb05e10a6ba","Type":"ContainerDied","Data":"a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b"} Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.769837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bjwzm" event={"ID":"eca929ee-fc6d-4197-a1cd-acb05e10a6ba","Type":"ContainerDied","Data":"4e49fa412762275961024c64edd4c69e424f4b0259b20ff073f08c44a7c93f57"} Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.769862 4749 scope.go:117] "RemoveContainer" containerID="a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.770285 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n58zm" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" containerName="registry-server" containerID="cri-o://d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219" gracePeriod=2 Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.808100 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bjwzm"] Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.810667 4749 scope.go:117] "RemoveContainer" containerID="ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783" Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.817415 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bjwzm"] Oct 01 13:44:50 crc kubenswrapper[4749]: I1001 13:44:50.909255 4749 scope.go:117] "RemoveContainer" containerID="6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471" Oct 01 13:44:51 crc 
kubenswrapper[4749]: I1001 13:44:51.027996 4749 scope.go:117] "RemoveContainer" containerID="a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b" Oct 01 13:44:51 crc kubenswrapper[4749]: E1001 13:44:51.030372 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b\": container with ID starting with a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b not found: ID does not exist" containerID="a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.030411 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b"} err="failed to get container status \"a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b\": rpc error: code = NotFound desc = could not find container \"a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b\": container with ID starting with a80764b8dab064c0fefbedd1dab341aedb471801b4328e95f74fce18bcee3b6b not found: ID does not exist" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.030435 4749 scope.go:117] "RemoveContainer" containerID="ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783" Oct 01 13:44:51 crc kubenswrapper[4749]: E1001 13:44:51.030797 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783\": container with ID starting with ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783 not found: ID does not exist" containerID="ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.030850 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783"} err="failed to get container status \"ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783\": rpc error: code = NotFound desc = could not find container \"ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783\": container with ID starting with ad5e367a9ce18e78df3453c503f4b598e4849e2e6431c11b191a3f64a3edb783 not found: ID does not exist" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.030881 4749 scope.go:117] "RemoveContainer" containerID="6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471" Oct 01 13:44:51 crc kubenswrapper[4749]: E1001 13:44:51.031169 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471\": container with ID starting with 6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471 not found: ID does not exist" containerID="6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.031202 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471"} err="failed to get container status \"6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471\": rpc error: code = NotFound desc = could not find container \"6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471\": container with ID starting with 6c30ede4b51922747d75e1ac448a707e4aac40410b5fd06cd60261a2bc82c471 not found: ID does not exist" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.243097 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" path="/var/lib/kubelet/pods/eca929ee-fc6d-4197-a1cd-acb05e10a6ba/volumes" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 
13:44:51.272936 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.438122 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-catalog-content\") pod \"c0d6ca94-68a9-4233-90a0-83998c687856\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.438200 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-utilities\") pod \"c0d6ca94-68a9-4233-90a0-83998c687856\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.438694 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79725\" (UniqueName: \"kubernetes.io/projected/c0d6ca94-68a9-4233-90a0-83998c687856-kube-api-access-79725\") pod \"c0d6ca94-68a9-4233-90a0-83998c687856\" (UID: \"c0d6ca94-68a9-4233-90a0-83998c687856\") " Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.439877 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-utilities" (OuterVolumeSpecName: "utilities") pod "c0d6ca94-68a9-4233-90a0-83998c687856" (UID: "c0d6ca94-68a9-4233-90a0-83998c687856"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.444101 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d6ca94-68a9-4233-90a0-83998c687856-kube-api-access-79725" (OuterVolumeSpecName: "kube-api-access-79725") pod "c0d6ca94-68a9-4233-90a0-83998c687856" (UID: "c0d6ca94-68a9-4233-90a0-83998c687856"). InnerVolumeSpecName "kube-api-access-79725". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.452433 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0d6ca94-68a9-4233-90a0-83998c687856" (UID: "c0d6ca94-68a9-4233-90a0-83998c687856"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.540834 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79725\" (UniqueName: \"kubernetes.io/projected/c0d6ca94-68a9-4233-90a0-83998c687856-kube-api-access-79725\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.541094 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.541184 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0d6ca94-68a9-4233-90a0-83998c687856-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.786523 4749 generic.go:334] "Generic (PLEG): container finished" podID="c0d6ca94-68a9-4233-90a0-83998c687856" 
containerID="d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219" exitCode=0 Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.786576 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n58zm" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.786569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n58zm" event={"ID":"c0d6ca94-68a9-4233-90a0-83998c687856","Type":"ContainerDied","Data":"d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219"} Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.786725 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n58zm" event={"ID":"c0d6ca94-68a9-4233-90a0-83998c687856","Type":"ContainerDied","Data":"05d1f1f22f1b0ab910e10125c3a8a1e9ead947d2b4e4281b169fe6be72a3e5f0"} Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.786750 4749 scope.go:117] "RemoveContainer" containerID="d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.822082 4749 scope.go:117] "RemoveContainer" containerID="c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.835472 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n58zm"] Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.848798 4749 scope.go:117] "RemoveContainer" containerID="78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.854563 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n58zm"] Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.877054 4749 scope.go:117] "RemoveContainer" containerID="d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219" Oct 01 
13:44:51 crc kubenswrapper[4749]: E1001 13:44:51.877557 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219\": container with ID starting with d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219 not found: ID does not exist" containerID="d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.877621 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219"} err="failed to get container status \"d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219\": rpc error: code = NotFound desc = could not find container \"d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219\": container with ID starting with d9c558d95061f209b0131345dfb770e198a972d6ef80bdfed62eed5c734d4219 not found: ID does not exist" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.877669 4749 scope.go:117] "RemoveContainer" containerID="c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf" Oct 01 13:44:51 crc kubenswrapper[4749]: E1001 13:44:51.878010 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf\": container with ID starting with c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf not found: ID does not exist" containerID="c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.878040 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf"} err="failed to get container status 
\"c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf\": rpc error: code = NotFound desc = could not find container \"c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf\": container with ID starting with c05c2b4b0cd7f18316a9996b9da94ac291af4039628208b437b0d31922a4d1cf not found: ID does not exist" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.878061 4749 scope.go:117] "RemoveContainer" containerID="78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d" Oct 01 13:44:51 crc kubenswrapper[4749]: E1001 13:44:51.878497 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d\": container with ID starting with 78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d not found: ID does not exist" containerID="78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d" Oct 01 13:44:51 crc kubenswrapper[4749]: I1001 13:44:51.878530 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d"} err="failed to get container status \"78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d\": rpc error: code = NotFound desc = could not find container \"78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d\": container with ID starting with 78a5b10ce6f14c9914ea1fa4d3399b82711290822a46db3abb56638ca16e9b9d not found: ID does not exist" Oct 01 13:44:53 crc kubenswrapper[4749]: I1001 13:44:53.229698 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:44:53 crc kubenswrapper[4749]: E1001 13:44:53.230076 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:44:53 crc kubenswrapper[4749]: I1001 13:44:53.244997 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" path="/var/lib/kubelet/pods/c0d6ca94-68a9-4233-90a0-83998c687856/volumes" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.157449 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx"] Oct 01 13:45:00 crc kubenswrapper[4749]: E1001 13:45:00.158484 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" containerName="extract-utilities" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.158503 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" containerName="extract-utilities" Oct 01 13:45:00 crc kubenswrapper[4749]: E1001 13:45:00.158519 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.158527 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4749]: E1001 13:45:00.158541 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerName="extract-utilities" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.158549 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerName="extract-utilities" Oct 01 13:45:00 crc kubenswrapper[4749]: E1001 13:45:00.158563 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" containerName="extract-content" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.158571 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" containerName="extract-content" Oct 01 13:45:00 crc kubenswrapper[4749]: E1001 13:45:00.158614 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.158622 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4749]: E1001 13:45:00.158643 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerName="extract-content" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.158651 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerName="extract-content" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.158904 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d6ca94-68a9-4233-90a0-83998c687856" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.158942 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca929ee-fc6d-4197-a1cd-acb05e10a6ba" containerName="registry-server" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.160792 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.164167 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.164702 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.165882 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx"] Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.235523 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xms6\" (UniqueName: \"kubernetes.io/projected/a896be31-cf94-4d34-828a-c386800fb02c-kube-api-access-2xms6\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.235616 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a896be31-cf94-4d34-828a-c386800fb02c-secret-volume\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.235777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a896be31-cf94-4d34-828a-c386800fb02c-config-volume\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.338041 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xms6\" (UniqueName: \"kubernetes.io/projected/a896be31-cf94-4d34-828a-c386800fb02c-kube-api-access-2xms6\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.338433 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a896be31-cf94-4d34-828a-c386800fb02c-secret-volume\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.338561 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a896be31-cf94-4d34-828a-c386800fb02c-config-volume\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.339403 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a896be31-cf94-4d34-828a-c386800fb02c-config-volume\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.345145 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a896be31-cf94-4d34-828a-c386800fb02c-secret-volume\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.361286 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xms6\" (UniqueName: \"kubernetes.io/projected/a896be31-cf94-4d34-828a-c386800fb02c-kube-api-access-2xms6\") pod \"collect-profiles-29322105-mddtx\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:00 crc kubenswrapper[4749]: I1001 13:45:00.527354 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:01 crc kubenswrapper[4749]: I1001 13:45:01.027597 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx"] Oct 01 13:45:01 crc kubenswrapper[4749]: I1001 13:45:01.906601 4749 generic.go:334] "Generic (PLEG): container finished" podID="a896be31-cf94-4d34-828a-c386800fb02c" containerID="e1b0221febadd199a13554849d73c3da37b3fd0b6b3c53cb226b7ff4683b2036" exitCode=0 Oct 01 13:45:01 crc kubenswrapper[4749]: I1001 13:45:01.906940 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" event={"ID":"a896be31-cf94-4d34-828a-c386800fb02c","Type":"ContainerDied","Data":"e1b0221febadd199a13554849d73c3da37b3fd0b6b3c53cb226b7ff4683b2036"} Oct 01 13:45:01 crc kubenswrapper[4749]: I1001 13:45:01.906980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" 
event={"ID":"a896be31-cf94-4d34-828a-c386800fb02c","Type":"ContainerStarted","Data":"d4f850b3108c4828a6c97da9b84934fb93140d22f7806f649fef8d1725180a82"} Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.292017 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.396372 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a896be31-cf94-4d34-828a-c386800fb02c-config-volume\") pod \"a896be31-cf94-4d34-828a-c386800fb02c\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.396692 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xms6\" (UniqueName: \"kubernetes.io/projected/a896be31-cf94-4d34-828a-c386800fb02c-kube-api-access-2xms6\") pod \"a896be31-cf94-4d34-828a-c386800fb02c\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.396783 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a896be31-cf94-4d34-828a-c386800fb02c-secret-volume\") pod \"a896be31-cf94-4d34-828a-c386800fb02c\" (UID: \"a896be31-cf94-4d34-828a-c386800fb02c\") " Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.397246 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a896be31-cf94-4d34-828a-c386800fb02c-config-volume" (OuterVolumeSpecName: "config-volume") pod "a896be31-cf94-4d34-828a-c386800fb02c" (UID: "a896be31-cf94-4d34-828a-c386800fb02c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.402013 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a896be31-cf94-4d34-828a-c386800fb02c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a896be31-cf94-4d34-828a-c386800fb02c" (UID: "a896be31-cf94-4d34-828a-c386800fb02c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.403598 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a896be31-cf94-4d34-828a-c386800fb02c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.403702 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a896be31-cf94-4d34-828a-c386800fb02c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.412109 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a896be31-cf94-4d34-828a-c386800fb02c-kube-api-access-2xms6" (OuterVolumeSpecName: "kube-api-access-2xms6") pod "a896be31-cf94-4d34-828a-c386800fb02c" (UID: "a896be31-cf94-4d34-828a-c386800fb02c"). InnerVolumeSpecName "kube-api-access-2xms6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.505859 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xms6\" (UniqueName: \"kubernetes.io/projected/a896be31-cf94-4d34-828a-c386800fb02c-kube-api-access-2xms6\") on node \"crc\" DevicePath \"\"" Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.935415 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" event={"ID":"a896be31-cf94-4d34-828a-c386800fb02c","Type":"ContainerDied","Data":"d4f850b3108c4828a6c97da9b84934fb93140d22f7806f649fef8d1725180a82"} Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.935455 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f850b3108c4828a6c97da9b84934fb93140d22f7806f649fef8d1725180a82" Oct 01 13:45:03 crc kubenswrapper[4749]: I1001 13:45:03.935507 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx" Oct 01 13:45:04 crc kubenswrapper[4749]: I1001 13:45:04.398759 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx"] Oct 01 13:45:04 crc kubenswrapper[4749]: I1001 13:45:04.412572 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322060-6hjrx"] Oct 01 13:45:05 crc kubenswrapper[4749]: I1001 13:45:05.251263 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5321a24-d271-46cb-9d0a-fde8089a6ddc" path="/var/lib/kubelet/pods/d5321a24-d271-46cb-9d0a-fde8089a6ddc/volumes" Oct 01 13:45:07 crc kubenswrapper[4749]: I1001 13:45:07.230515 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:45:07 crc kubenswrapper[4749]: E1001 13:45:07.232865 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:45:09 crc kubenswrapper[4749]: I1001 13:45:09.635279 4749 scope.go:117] "RemoveContainer" containerID="d09e4abe7c9b199e38017b52d57c783aac5ad187ac71bb789a2d4d0f1d648826" Oct 01 13:45:22 crc kubenswrapper[4749]: I1001 13:45:22.229931 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:45:22 crc kubenswrapper[4749]: E1001 13:45:22.230799 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:45:36 crc kubenswrapper[4749]: I1001 13:45:36.230567 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:45:36 crc kubenswrapper[4749]: E1001 13:45:36.231336 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:45:50 crc kubenswrapper[4749]: I1001 13:45:50.230208 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:45:50 crc kubenswrapper[4749]: E1001 13:45:50.230996 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:46:01 crc kubenswrapper[4749]: I1001 13:46:01.237522 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:46:01 crc kubenswrapper[4749]: E1001 13:46:01.238343 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:46:13 crc kubenswrapper[4749]: I1001 13:46:13.230513 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:46:13 crc kubenswrapper[4749]: E1001 13:46:13.231335 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:46:27 crc kubenswrapper[4749]: I1001 13:46:27.230102 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:46:27 crc kubenswrapper[4749]: E1001 13:46:27.231183 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:46:38 crc kubenswrapper[4749]: I1001 13:46:38.229829 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:46:38 crc kubenswrapper[4749]: E1001 13:46:38.230943 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:46:49 crc kubenswrapper[4749]: I1001 13:46:49.230272 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:46:49 crc kubenswrapper[4749]: E1001 13:46:49.231175 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:47:03 crc kubenswrapper[4749]: I1001 13:47:03.230626 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:47:03 crc kubenswrapper[4749]: E1001 13:47:03.231387 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:47:17 crc kubenswrapper[4749]: I1001 13:47:17.231008 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:47:17 crc kubenswrapper[4749]: E1001 13:47:17.233272 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:47:29 crc kubenswrapper[4749]: I1001 13:47:29.231160 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:47:29 crc kubenswrapper[4749]: E1001 13:47:29.232267 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:47:40 crc kubenswrapper[4749]: I1001 13:47:40.230820 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:47:40 crc kubenswrapper[4749]: E1001 13:47:40.231958 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:47:51 crc kubenswrapper[4749]: I1001 13:47:51.230023 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:47:51 crc kubenswrapper[4749]: E1001 13:47:51.231353 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:48:02 crc kubenswrapper[4749]: I1001 13:48:02.230143 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:48:02 crc kubenswrapper[4749]: I1001 13:48:02.922090 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"42dfa798e693af341ed4d4d0c7dda7977e057acf005434caadc63a2b7f7ef228"} Oct 01 13:49:00 crc kubenswrapper[4749]: I1001 13:49:00.577876 4749 generic.go:334] "Generic (PLEG): container finished" podID="f2871c6b-b170-4396-8c0b-be0ac02c1b48" containerID="a3a2d39a82a5834d8300d97d2195aa798622024b6f37ded91c553646297e443c" exitCode=0 Oct 01 13:49:00 crc kubenswrapper[4749]: I1001 13:49:00.577996 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" event={"ID":"f2871c6b-b170-4396-8c0b-be0ac02c1b48","Type":"ContainerDied","Data":"a3a2d39a82a5834d8300d97d2195aa798622024b6f37ded91c553646297e443c"} Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.047196 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.205664 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-ssh-key\") pod \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.205890 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-combined-ca-bundle\") pod \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.205978 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-inventory\") pod \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.206019 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-secret-0\") pod \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.206138 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r2vv\" (UniqueName: \"kubernetes.io/projected/f2871c6b-b170-4396-8c0b-be0ac02c1b48-kube-api-access-4r2vv\") pod \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\" (UID: \"f2871c6b-b170-4396-8c0b-be0ac02c1b48\") " Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.216387 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/f2871c6b-b170-4396-8c0b-be0ac02c1b48-kube-api-access-4r2vv" (OuterVolumeSpecName: "kube-api-access-4r2vv") pod "f2871c6b-b170-4396-8c0b-be0ac02c1b48" (UID: "f2871c6b-b170-4396-8c0b-be0ac02c1b48"). InnerVolumeSpecName "kube-api-access-4r2vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.217687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f2871c6b-b170-4396-8c0b-be0ac02c1b48" (UID: "f2871c6b-b170-4396-8c0b-be0ac02c1b48"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.238658 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-inventory" (OuterVolumeSpecName: "inventory") pod "f2871c6b-b170-4396-8c0b-be0ac02c1b48" (UID: "f2871c6b-b170-4396-8c0b-be0ac02c1b48"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.240928 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "f2871c6b-b170-4396-8c0b-be0ac02c1b48" (UID: "f2871c6b-b170-4396-8c0b-be0ac02c1b48"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.251477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2871c6b-b170-4396-8c0b-be0ac02c1b48" (UID: "f2871c6b-b170-4396-8c0b-be0ac02c1b48"). 
InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.308376 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.308455 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.308469 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r2vv\" (UniqueName: \"kubernetes.io/projected/f2871c6b-b170-4396-8c0b-be0ac02c1b48-kube-api-access-4r2vv\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.308481 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.308492 4749 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2871c6b-b170-4396-8c0b-be0ac02c1b48-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.618791 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" event={"ID":"f2871c6b-b170-4396-8c0b-be0ac02c1b48","Type":"ContainerDied","Data":"15081485705df72e136b107468a18132a6e16f050f611c64701e5551d0e58518"} Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.619137 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15081485705df72e136b107468a18132a6e16f050f611c64701e5551d0e58518" Oct 01 13:49:02 
crc kubenswrapper[4749]: I1001 13:49:02.619346 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bksd6" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.718110 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg"] Oct 01 13:49:02 crc kubenswrapper[4749]: E1001 13:49:02.718505 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a896be31-cf94-4d34-828a-c386800fb02c" containerName="collect-profiles" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.718521 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a896be31-cf94-4d34-828a-c386800fb02c" containerName="collect-profiles" Oct 01 13:49:02 crc kubenswrapper[4749]: E1001 13:49:02.718558 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2871c6b-b170-4396-8c0b-be0ac02c1b48" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.718566 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2871c6b-b170-4396-8c0b-be0ac02c1b48" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.718741 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a896be31-cf94-4d34-828a-c386800fb02c" containerName="collect-profiles" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.718757 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2871c6b-b170-4396-8c0b-be0ac02c1b48" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.719416 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.721777 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.722238 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.722305 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.722326 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.722750 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.722980 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.723079 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.739668 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg"] Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.816920 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.816984 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.817044 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.817258 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.817545 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.817623 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.817672 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.817743 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.817794 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk9lq\" (UniqueName: \"kubernetes.io/projected/848e191d-2e82-41af-8368-7c9c7e7b200e-kube-api-access-vk9lq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.920363 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: 
\"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.920519 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.920680 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.920767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.920811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.920850 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vk9lq\" (UniqueName: \"kubernetes.io/projected/848e191d-2e82-41af-8368-7c9c7e7b200e-kube-api-access-vk9lq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.920920 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.920989 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.921094 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.922015 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.926119 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.926475 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.926474 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.926637 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.926886 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.928905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.938783 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:02 crc kubenswrapper[4749]: I1001 13:49:02.942063 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk9lq\" (UniqueName: \"kubernetes.io/projected/848e191d-2e82-41af-8368-7c9c7e7b200e-kube-api-access-vk9lq\") pod \"nova-edpm-deployment-openstack-edpm-ipam-n4plg\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:03 crc kubenswrapper[4749]: I1001 13:49:03.046495 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:49:03 crc kubenswrapper[4749]: I1001 13:49:03.585475 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg"] Oct 01 13:49:03 crc kubenswrapper[4749]: I1001 13:49:03.586870 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:49:03 crc kubenswrapper[4749]: I1001 13:49:03.632026 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" event={"ID":"848e191d-2e82-41af-8368-7c9c7e7b200e","Type":"ContainerStarted","Data":"7995f5e1d991a65e53e62ddb16eb36ad557c0107c9d7068e2f4f9d1599ea4406"} Oct 01 13:49:04 crc kubenswrapper[4749]: I1001 13:49:04.655833 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" event={"ID":"848e191d-2e82-41af-8368-7c9c7e7b200e","Type":"ContainerStarted","Data":"7b65e74784f9ddc6d2e8edf65387f26bd369be066ef50235fcdad5d687f2983f"} Oct 01 13:49:04 crc kubenswrapper[4749]: I1001 13:49:04.679149 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" podStartSLOduration=2.224940129 podStartE2EDuration="2.679129399s" podCreationTimestamp="2025-10-01 13:49:02 +0000 UTC" firstStartedPulling="2025-10-01 13:49:03.58662526 +0000 UTC m=+2603.640610169" lastFinishedPulling="2025-10-01 13:49:04.04081454 +0000 UTC m=+2604.094799439" observedRunningTime="2025-10-01 13:49:04.669500753 +0000 UTC m=+2604.723485672" watchObservedRunningTime="2025-10-01 13:49:04.679129399 +0000 UTC m=+2604.733114298" Oct 01 13:50:02 crc kubenswrapper[4749]: I1001 13:50:02.106756 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:50:02 crc kubenswrapper[4749]: I1001 13:50:02.107695 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:50:15 crc kubenswrapper[4749]: I1001 13:50:15.874212 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gs4wc"] Oct 01 13:50:15 crc kubenswrapper[4749]: I1001 13:50:15.877770 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:15 crc kubenswrapper[4749]: I1001 13:50:15.924782 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gs4wc"] Oct 01 13:50:15 crc kubenswrapper[4749]: I1001 13:50:15.973481 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-catalog-content\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:15 crc kubenswrapper[4749]: I1001 13:50:15.973875 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gf2\" (UniqueName: \"kubernetes.io/projected/036a07d0-48a5-4b25-9b37-0a854960006d-kube-api-access-k4gf2\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:15 crc kubenswrapper[4749]: I1001 13:50:15.973955 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-utilities\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:16 crc kubenswrapper[4749]: I1001 13:50:16.076102 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4gf2\" (UniqueName: \"kubernetes.io/projected/036a07d0-48a5-4b25-9b37-0a854960006d-kube-api-access-k4gf2\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:16 crc kubenswrapper[4749]: I1001 13:50:16.076152 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-utilities\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:16 crc kubenswrapper[4749]: I1001 13:50:16.076227 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-catalog-content\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:16 crc kubenswrapper[4749]: I1001 13:50:16.076680 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-utilities\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:16 crc kubenswrapper[4749]: I1001 13:50:16.076854 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-catalog-content\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:16 crc kubenswrapper[4749]: I1001 13:50:16.101892 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4gf2\" (UniqueName: \"kubernetes.io/projected/036a07d0-48a5-4b25-9b37-0a854960006d-kube-api-access-k4gf2\") pod \"redhat-operators-gs4wc\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:16 crc kubenswrapper[4749]: I1001 13:50:16.202554 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:16 crc kubenswrapper[4749]: I1001 13:50:16.740012 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gs4wc"] Oct 01 13:50:17 crc kubenswrapper[4749]: I1001 13:50:17.452775 4749 generic.go:334] "Generic (PLEG): container finished" podID="036a07d0-48a5-4b25-9b37-0a854960006d" containerID="010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321" exitCode=0 Oct 01 13:50:17 crc kubenswrapper[4749]: I1001 13:50:17.452925 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs4wc" event={"ID":"036a07d0-48a5-4b25-9b37-0a854960006d","Type":"ContainerDied","Data":"010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321"} Oct 01 13:50:17 crc kubenswrapper[4749]: I1001 13:50:17.453170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs4wc" event={"ID":"036a07d0-48a5-4b25-9b37-0a854960006d","Type":"ContainerStarted","Data":"1aff2026c136f53d10f64365b2aad86174bcc6a1ee325be9311b024c9797c7c4"} Oct 01 13:50:19 crc kubenswrapper[4749]: I1001 13:50:19.476485 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="036a07d0-48a5-4b25-9b37-0a854960006d" containerID="840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8" exitCode=0 Oct 01 13:50:19 crc kubenswrapper[4749]: I1001 13:50:19.476614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs4wc" event={"ID":"036a07d0-48a5-4b25-9b37-0a854960006d","Type":"ContainerDied","Data":"840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8"} Oct 01 13:50:20 crc kubenswrapper[4749]: I1001 13:50:20.496128 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs4wc" event={"ID":"036a07d0-48a5-4b25-9b37-0a854960006d","Type":"ContainerStarted","Data":"b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb"} Oct 01 13:50:20 crc kubenswrapper[4749]: I1001 13:50:20.527103 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gs4wc" podStartSLOduration=2.855188455 podStartE2EDuration="5.527087082s" podCreationTimestamp="2025-10-01 13:50:15 +0000 UTC" firstStartedPulling="2025-10-01 13:50:17.457667209 +0000 UTC m=+2677.511652108" lastFinishedPulling="2025-10-01 13:50:20.129565836 +0000 UTC m=+2680.183550735" observedRunningTime="2025-10-01 13:50:20.520212345 +0000 UTC m=+2680.574197294" watchObservedRunningTime="2025-10-01 13:50:20.527087082 +0000 UTC m=+2680.581071971" Oct 01 13:50:26 crc kubenswrapper[4749]: I1001 13:50:26.203088 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:26 crc kubenswrapper[4749]: I1001 13:50:26.203773 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:26 crc kubenswrapper[4749]: I1001 13:50:26.246818 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:26 crc 
kubenswrapper[4749]: I1001 13:50:26.612613 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:26 crc kubenswrapper[4749]: I1001 13:50:26.658893 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gs4wc"] Oct 01 13:50:28 crc kubenswrapper[4749]: I1001 13:50:28.583931 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gs4wc" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" containerName="registry-server" containerID="cri-o://b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb" gracePeriod=2 Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.052504 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.150604 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4gf2\" (UniqueName: \"kubernetes.io/projected/036a07d0-48a5-4b25-9b37-0a854960006d-kube-api-access-k4gf2\") pod \"036a07d0-48a5-4b25-9b37-0a854960006d\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.150653 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-utilities\") pod \"036a07d0-48a5-4b25-9b37-0a854960006d\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.150807 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-catalog-content\") pod \"036a07d0-48a5-4b25-9b37-0a854960006d\" (UID: \"036a07d0-48a5-4b25-9b37-0a854960006d\") " Oct 01 13:50:29 crc 
kubenswrapper[4749]: I1001 13:50:29.151537 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-utilities" (OuterVolumeSpecName: "utilities") pod "036a07d0-48a5-4b25-9b37-0a854960006d" (UID: "036a07d0-48a5-4b25-9b37-0a854960006d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.155687 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036a07d0-48a5-4b25-9b37-0a854960006d-kube-api-access-k4gf2" (OuterVolumeSpecName: "kube-api-access-k4gf2") pod "036a07d0-48a5-4b25-9b37-0a854960006d" (UID: "036a07d0-48a5-4b25-9b37-0a854960006d"). InnerVolumeSpecName "kube-api-access-k4gf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.253124 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.253166 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4gf2\" (UniqueName: \"kubernetes.io/projected/036a07d0-48a5-4b25-9b37-0a854960006d-kube-api-access-k4gf2\") on node \"crc\" DevicePath \"\"" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.256622 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "036a07d0-48a5-4b25-9b37-0a854960006d" (UID: "036a07d0-48a5-4b25-9b37-0a854960006d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.355624 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/036a07d0-48a5-4b25-9b37-0a854960006d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.595888 4749 generic.go:334] "Generic (PLEG): container finished" podID="036a07d0-48a5-4b25-9b37-0a854960006d" containerID="b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb" exitCode=0 Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.595942 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs4wc" event={"ID":"036a07d0-48a5-4b25-9b37-0a854960006d","Type":"ContainerDied","Data":"b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb"} Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.595981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gs4wc" event={"ID":"036a07d0-48a5-4b25-9b37-0a854960006d","Type":"ContainerDied","Data":"1aff2026c136f53d10f64365b2aad86174bcc6a1ee325be9311b024c9797c7c4"} Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.596004 4749 scope.go:117] "RemoveContainer" containerID="b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.595953 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gs4wc" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.625911 4749 scope.go:117] "RemoveContainer" containerID="840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.640500 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gs4wc"] Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.648091 4749 scope.go:117] "RemoveContainer" containerID="010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.656084 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gs4wc"] Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.697930 4749 scope.go:117] "RemoveContainer" containerID="b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb" Oct 01 13:50:29 crc kubenswrapper[4749]: E1001 13:50:29.698381 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb\": container with ID starting with b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb not found: ID does not exist" containerID="b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.698417 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb"} err="failed to get container status \"b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb\": rpc error: code = NotFound desc = could not find container \"b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb\": container with ID starting with b9ad52e7143945146a18b13ee4a5e1053c76c6942b1dd3ed1215862f9f4e91fb not found: ID does 
not exist" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.698442 4749 scope.go:117] "RemoveContainer" containerID="840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8" Oct 01 13:50:29 crc kubenswrapper[4749]: E1001 13:50:29.699046 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8\": container with ID starting with 840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8 not found: ID does not exist" containerID="840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.699070 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8"} err="failed to get container status \"840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8\": rpc error: code = NotFound desc = could not find container \"840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8\": container with ID starting with 840f2bcc102e3aecf4ca28fd70cc65b1fcc1ca31795e3bb9ace1aec664a2c9e8 not found: ID does not exist" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.699087 4749 scope.go:117] "RemoveContainer" containerID="010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321" Oct 01 13:50:29 crc kubenswrapper[4749]: E1001 13:50:29.699485 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321\": container with ID starting with 010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321 not found: ID does not exist" containerID="010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321" Oct 01 13:50:29 crc kubenswrapper[4749]: I1001 13:50:29.699535 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321"} err="failed to get container status \"010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321\": rpc error: code = NotFound desc = could not find container \"010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321\": container with ID starting with 010ce65e636d46e8e8377f931a1e719b1576e5ad0bf0443f3203ab16066ca321 not found: ID does not exist" Oct 01 13:50:31 crc kubenswrapper[4749]: I1001 13:50:31.248092 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" path="/var/lib/kubelet/pods/036a07d0-48a5-4b25-9b37-0a854960006d/volumes" Oct 01 13:50:32 crc kubenswrapper[4749]: I1001 13:50:32.106194 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:50:32 crc kubenswrapper[4749]: I1001 13:50:32.106585 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.105979 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.106604 4749 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.106652 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.107429 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42dfa798e693af341ed4d4d0c7dda7977e057acf005434caadc63a2b7f7ef228"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.107491 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://42dfa798e693af341ed4d4d0c7dda7977e057acf005434caadc63a2b7f7ef228" gracePeriod=600 Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.967029 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="42dfa798e693af341ed4d4d0c7dda7977e057acf005434caadc63a2b7f7ef228" exitCode=0 Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.967107 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"42dfa798e693af341ed4d4d0c7dda7977e057acf005434caadc63a2b7f7ef228"} Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.968180 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082"} Oct 01 13:51:02 crc kubenswrapper[4749]: I1001 13:51:02.968246 4749 scope.go:117] "RemoveContainer" containerID="e6cf2cf510c6e1ed87537c72d814669e36485cf78d9c744a24b6a939a789425f" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.447903 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hk7bs"] Oct 01 13:51:44 crc kubenswrapper[4749]: E1001 13:51:44.449276 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" containerName="extract-content" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.449293 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" containerName="extract-content" Oct 01 13:51:44 crc kubenswrapper[4749]: E1001 13:51:44.449318 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" containerName="registry-server" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.449326 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" containerName="registry-server" Oct 01 13:51:44 crc kubenswrapper[4749]: E1001 13:51:44.449350 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" containerName="extract-utilities" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.449357 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" containerName="extract-utilities" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.449627 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="036a07d0-48a5-4b25-9b37-0a854960006d" containerName="registry-server" Oct 01 
13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.451413 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.463380 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hk7bs"] Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.475997 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7j79\" (UniqueName: \"kubernetes.io/projected/dc9e60b6-d826-4dd4-977e-7ba492799b69-kube-api-access-q7j79\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.476435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-catalog-content\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.476512 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-utilities\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.577320 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-catalog-content\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " 
pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.577407 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-utilities\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.577508 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7j79\" (UniqueName: \"kubernetes.io/projected/dc9e60b6-d826-4dd4-977e-7ba492799b69-kube-api-access-q7j79\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.578316 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-catalog-content\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.578622 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-utilities\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.599545 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7j79\" (UniqueName: \"kubernetes.io/projected/dc9e60b6-d826-4dd4-977e-7ba492799b69-kube-api-access-q7j79\") pod \"certified-operators-hk7bs\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " 
pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:44 crc kubenswrapper[4749]: I1001 13:51:44.771794 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:45 crc kubenswrapper[4749]: I1001 13:51:45.287161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hk7bs"] Oct 01 13:51:45 crc kubenswrapper[4749]: I1001 13:51:45.471941 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hk7bs" event={"ID":"dc9e60b6-d826-4dd4-977e-7ba492799b69","Type":"ContainerStarted","Data":"490b9898d97495949f34facd291af76f459043e67ed461269610d6e6e34c19b5"} Oct 01 13:51:46 crc kubenswrapper[4749]: I1001 13:51:46.487458 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerID="bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47" exitCode=0 Oct 01 13:51:46 crc kubenswrapper[4749]: I1001 13:51:46.487577 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hk7bs" event={"ID":"dc9e60b6-d826-4dd4-977e-7ba492799b69","Type":"ContainerDied","Data":"bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47"} Oct 01 13:51:48 crc kubenswrapper[4749]: I1001 13:51:48.509035 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerID="f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53" exitCode=0 Oct 01 13:51:48 crc kubenswrapper[4749]: I1001 13:51:48.509096 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hk7bs" event={"ID":"dc9e60b6-d826-4dd4-977e-7ba492799b69","Type":"ContainerDied","Data":"f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53"} Oct 01 13:51:49 crc kubenswrapper[4749]: I1001 13:51:49.522563 4749 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-hk7bs" event={"ID":"dc9e60b6-d826-4dd4-977e-7ba492799b69","Type":"ContainerStarted","Data":"395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487"} Oct 01 13:51:49 crc kubenswrapper[4749]: I1001 13:51:49.553290 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hk7bs" podStartSLOduration=3.031451821 podStartE2EDuration="5.553264655s" podCreationTimestamp="2025-10-01 13:51:44 +0000 UTC" firstStartedPulling="2025-10-01 13:51:46.489896336 +0000 UTC m=+2766.543881255" lastFinishedPulling="2025-10-01 13:51:49.01170919 +0000 UTC m=+2769.065694089" observedRunningTime="2025-10-01 13:51:49.551125204 +0000 UTC m=+2769.605110103" watchObservedRunningTime="2025-10-01 13:51:49.553264655 +0000 UTC m=+2769.607249594" Oct 01 13:51:54 crc kubenswrapper[4749]: I1001 13:51:54.772544 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:54 crc kubenswrapper[4749]: I1001 13:51:54.773132 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:54 crc kubenswrapper[4749]: I1001 13:51:54.819074 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:55 crc kubenswrapper[4749]: I1001 13:51:55.667263 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:55 crc kubenswrapper[4749]: I1001 13:51:55.724836 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hk7bs"] Oct 01 13:51:57 crc kubenswrapper[4749]: I1001 13:51:57.611806 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hk7bs" 
podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerName="registry-server" containerID="cri-o://395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487" gracePeriod=2 Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.126895 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.262995 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-utilities\") pod \"dc9e60b6-d826-4dd4-977e-7ba492799b69\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.263089 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-catalog-content\") pod \"dc9e60b6-d826-4dd4-977e-7ba492799b69\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.263249 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7j79\" (UniqueName: \"kubernetes.io/projected/dc9e60b6-d826-4dd4-977e-7ba492799b69-kube-api-access-q7j79\") pod \"dc9e60b6-d826-4dd4-977e-7ba492799b69\" (UID: \"dc9e60b6-d826-4dd4-977e-7ba492799b69\") " Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.264700 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-utilities" (OuterVolumeSpecName: "utilities") pod "dc9e60b6-d826-4dd4-977e-7ba492799b69" (UID: "dc9e60b6-d826-4dd4-977e-7ba492799b69"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.272442 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc9e60b6-d826-4dd4-977e-7ba492799b69-kube-api-access-q7j79" (OuterVolumeSpecName: "kube-api-access-q7j79") pod "dc9e60b6-d826-4dd4-977e-7ba492799b69" (UID: "dc9e60b6-d826-4dd4-977e-7ba492799b69"). InnerVolumeSpecName "kube-api-access-q7j79". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.306102 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc9e60b6-d826-4dd4-977e-7ba492799b69" (UID: "dc9e60b6-d826-4dd4-977e-7ba492799b69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.366058 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.366475 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7j79\" (UniqueName: \"kubernetes.io/projected/dc9e60b6-d826-4dd4-977e-7ba492799b69-kube-api-access-q7j79\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.366534 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc9e60b6-d826-4dd4-977e-7ba492799b69-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.624482 4749 generic.go:334] "Generic (PLEG): container finished" podID="dc9e60b6-d826-4dd4-977e-7ba492799b69" 
containerID="395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487" exitCode=0 Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.624544 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hk7bs" event={"ID":"dc9e60b6-d826-4dd4-977e-7ba492799b69","Type":"ContainerDied","Data":"395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487"} Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.624585 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hk7bs" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.624612 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hk7bs" event={"ID":"dc9e60b6-d826-4dd4-977e-7ba492799b69","Type":"ContainerDied","Data":"490b9898d97495949f34facd291af76f459043e67ed461269610d6e6e34c19b5"} Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.624678 4749 scope.go:117] "RemoveContainer" containerID="395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.647452 4749 scope.go:117] "RemoveContainer" containerID="f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.674376 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hk7bs"] Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.690825 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hk7bs"] Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.703950 4749 scope.go:117] "RemoveContainer" containerID="bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.735853 4749 scope.go:117] "RemoveContainer" containerID="395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487" Oct 01 
13:51:58 crc kubenswrapper[4749]: E1001 13:51:58.736306 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487\": container with ID starting with 395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487 not found: ID does not exist" containerID="395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.736345 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487"} err="failed to get container status \"395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487\": rpc error: code = NotFound desc = could not find container \"395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487\": container with ID starting with 395fd3e62ef268b92ad1d40be1fb1e70e58ac43df3495310ae56e265f7549487 not found: ID does not exist" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.736366 4749 scope.go:117] "RemoveContainer" containerID="f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53" Oct 01 13:51:58 crc kubenswrapper[4749]: E1001 13:51:58.736600 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53\": container with ID starting with f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53 not found: ID does not exist" containerID="f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.736631 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53"} err="failed to get container status 
\"f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53\": rpc error: code = NotFound desc = could not find container \"f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53\": container with ID starting with f2ea79ad567dca5a218e8ab6fd05f5a823d6d96e0be0f508469d303ae6263f53 not found: ID does not exist" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.736652 4749 scope.go:117] "RemoveContainer" containerID="bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47" Oct 01 13:51:58 crc kubenswrapper[4749]: E1001 13:51:58.736864 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47\": container with ID starting with bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47 not found: ID does not exist" containerID="bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47" Oct 01 13:51:58 crc kubenswrapper[4749]: I1001 13:51:58.736884 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47"} err="failed to get container status \"bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47\": rpc error: code = NotFound desc = could not find container \"bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47\": container with ID starting with bc283e6f2ba1503c84952c8baca6443011dc232f7733bfebcfa84e976acc1a47 not found: ID does not exist" Oct 01 13:51:59 crc kubenswrapper[4749]: I1001 13:51:59.248999 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" path="/var/lib/kubelet/pods/dc9e60b6-d826-4dd4-977e-7ba492799b69/volumes" Oct 01 13:52:59 crc kubenswrapper[4749]: I1001 13:52:59.256650 4749 generic.go:334] "Generic (PLEG): container finished" podID="848e191d-2e82-41af-8368-7c9c7e7b200e" 
containerID="7b65e74784f9ddc6d2e8edf65387f26bd369be066ef50235fcdad5d687f2983f" exitCode=0 Oct 01 13:52:59 crc kubenswrapper[4749]: I1001 13:52:59.256752 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" event={"ID":"848e191d-2e82-41af-8368-7c9c7e7b200e","Type":"ContainerDied","Data":"7b65e74784f9ddc6d2e8edf65387f26bd369be066ef50235fcdad5d687f2983f"} Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.786044 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818158 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-0\") pod \"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818341 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-combined-ca-bundle\") pod \"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818404 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-1\") pod \"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818626 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-inventory\") pod 
\"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818676 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-extra-config-0\") pod \"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818732 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-ssh-key\") pod \"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-1\") pod \"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818869 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-0\") pod \"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.818942 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk9lq\" (UniqueName: \"kubernetes.io/projected/848e191d-2e82-41af-8368-7c9c7e7b200e-kube-api-access-vk9lq\") pod \"848e191d-2e82-41af-8368-7c9c7e7b200e\" (UID: \"848e191d-2e82-41af-8368-7c9c7e7b200e\") " Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.832910 4749 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848e191d-2e82-41af-8368-7c9c7e7b200e-kube-api-access-vk9lq" (OuterVolumeSpecName: "kube-api-access-vk9lq") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "kube-api-access-vk9lq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.852943 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.867650 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.886582 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.893832 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-inventory" (OuterVolumeSpecName: "inventory") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.894929 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.897114 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.905692 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.910535 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "848e191d-2e82-41af-8368-7c9c7e7b200e" (UID: "848e191d-2e82-41af-8368-7c9c7e7b200e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921837 4749 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921887 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921903 4749 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921920 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921935 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk9lq\" (UniqueName: \"kubernetes.io/projected/848e191d-2e82-41af-8368-7c9c7e7b200e-kube-api-access-vk9lq\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921949 4749 reconciler_common.go:293] 
"Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921962 4749 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921976 4749 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:00 crc kubenswrapper[4749]: I1001 13:53:00.921992 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/848e191d-2e82-41af-8368-7c9c7e7b200e-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.276040 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" event={"ID":"848e191d-2e82-41af-8368-7c9c7e7b200e","Type":"ContainerDied","Data":"7995f5e1d991a65e53e62ddb16eb36ad557c0107c9d7068e2f4f9d1599ea4406"} Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.276110 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7995f5e1d991a65e53e62ddb16eb36ad557c0107c9d7068e2f4f9d1599ea4406" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.276127 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-n4plg" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.377812 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6"] Oct 01 13:53:01 crc kubenswrapper[4749]: E1001 13:53:01.378270 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerName="extract-utilities" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.378292 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerName="extract-utilities" Oct 01 13:53:01 crc kubenswrapper[4749]: E1001 13:53:01.378315 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerName="extract-content" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.378323 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerName="extract-content" Oct 01 13:53:01 crc kubenswrapper[4749]: E1001 13:53:01.378344 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerName="registry-server" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.378353 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerName="registry-server" Oct 01 13:53:01 crc kubenswrapper[4749]: E1001 13:53:01.378370 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848e191d-2e82-41af-8368-7c9c7e7b200e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.378379 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="848e191d-2e82-41af-8368-7c9c7e7b200e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.378608 4749 
memory_manager.go:354] "RemoveStaleState removing state" podUID="848e191d-2e82-41af-8368-7c9c7e7b200e" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.378630 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc9e60b6-d826-4dd4-977e-7ba492799b69" containerName="registry-server" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.379487 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.383709 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.385291 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.387098 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d8vpn" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.387384 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.390929 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.404919 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6"] Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.433910 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.433963 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.433994 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7cxk\" (UniqueName: \"kubernetes.io/projected/a74a77b0-6409-400a-a75c-115e2b2cba85-kube-api-access-m7cxk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.434041 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.434067 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.434259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.434317 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.536899 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.536979 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.537025 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7cxk\" (UniqueName: \"kubernetes.io/projected/a74a77b0-6409-400a-a75c-115e2b2cba85-kube-api-access-m7cxk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.537073 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.537117 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.537362 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.537424 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.543006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.543077 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.544038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.544389 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.545510 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.546653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.564039 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7cxk\" (UniqueName: \"kubernetes.io/projected/a74a77b0-6409-400a-a75c-115e2b2cba85-kube-api-access-m7cxk\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:01 crc kubenswrapper[4749]: I1001 13:53:01.710926 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:53:02 crc kubenswrapper[4749]: I1001 13:53:02.106346 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:53:02 crc kubenswrapper[4749]: I1001 13:53:02.106708 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:53:02 crc kubenswrapper[4749]: I1001 13:53:02.278650 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6"] Oct 01 13:53:02 crc kubenswrapper[4749]: W1001 13:53:02.287281 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda74a77b0_6409_400a_a75c_115e2b2cba85.slice/crio-9f4b707f3ccd9575c5ffa90925c06f91510444cee15eafbba64a6537cb6185cd WatchSource:0}: Error finding container 9f4b707f3ccd9575c5ffa90925c06f91510444cee15eafbba64a6537cb6185cd: Status 404 returned error can't find the container with id 9f4b707f3ccd9575c5ffa90925c06f91510444cee15eafbba64a6537cb6185cd Oct 01 13:53:03 crc kubenswrapper[4749]: I1001 13:53:03.295661 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" event={"ID":"a74a77b0-6409-400a-a75c-115e2b2cba85","Type":"ContainerStarted","Data":"f7d3b7d70a7da5bb398996210d9869e776c0fdee3a6fd7d542c0acb57fd5f99e"} Oct 01 13:53:03 crc kubenswrapper[4749]: I1001 13:53:03.296117 4749 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" event={"ID":"a74a77b0-6409-400a-a75c-115e2b2cba85","Type":"ContainerStarted","Data":"9f4b707f3ccd9575c5ffa90925c06f91510444cee15eafbba64a6537cb6185cd"} Oct 01 13:53:03 crc kubenswrapper[4749]: I1001 13:53:03.323362 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" podStartSLOduration=1.848909965 podStartE2EDuration="2.323338766s" podCreationTimestamp="2025-10-01 13:53:01 +0000 UTC" firstStartedPulling="2025-10-01 13:53:02.290440765 +0000 UTC m=+2842.344425664" lastFinishedPulling="2025-10-01 13:53:02.764869536 +0000 UTC m=+2842.818854465" observedRunningTime="2025-10-01 13:53:03.314632366 +0000 UTC m=+2843.368617275" watchObservedRunningTime="2025-10-01 13:53:03.323338766 +0000 UTC m=+2843.377323675" Oct 01 13:53:32 crc kubenswrapper[4749]: I1001 13:53:32.106348 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 13:53:32 crc kubenswrapper[4749]: I1001 13:53:32.106915 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.106392 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.107155 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.107274 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.108535 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.108674 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" gracePeriod=600 Oct 01 13:54:02 crc kubenswrapper[4749]: E1001 13:54:02.229738 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.975593 
4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" exitCode=0 Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.975702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082"} Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.975955 4749 scope.go:117] "RemoveContainer" containerID="42dfa798e693af341ed4d4d0c7dda7977e057acf005434caadc63a2b7f7ef228" Oct 01 13:54:02 crc kubenswrapper[4749]: I1001 13:54:02.976545 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:54:02 crc kubenswrapper[4749]: E1001 13:54:02.976807 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:54:18 crc kubenswrapper[4749]: I1001 13:54:18.230825 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:54:18 crc kubenswrapper[4749]: E1001 13:54:18.233101 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:54:30 crc kubenswrapper[4749]: I1001 13:54:30.229879 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:54:30 crc kubenswrapper[4749]: E1001 13:54:30.230841 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:54:41 crc kubenswrapper[4749]: I1001 13:54:41.246740 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:54:41 crc kubenswrapper[4749]: E1001 13:54:41.248310 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:54:53 crc kubenswrapper[4749]: I1001 13:54:53.230271 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:54:53 crc kubenswrapper[4749]: E1001 13:54:53.231178 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:55:04 crc kubenswrapper[4749]: I1001 13:55:04.230498 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:55:04 crc kubenswrapper[4749]: E1001 13:55:04.231496 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.266111 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hz9j8"] Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.268508 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.277603 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hz9j8"] Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.454621 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvrzt\" (UniqueName: \"kubernetes.io/projected/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-kube-api-access-wvrzt\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.454783 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-utilities\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.454935 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-catalog-content\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.556505 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-utilities\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.556647 4749 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-catalog-content\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.556705 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvrzt\" (UniqueName: \"kubernetes.io/projected/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-kube-api-access-wvrzt\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.557146 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-utilities\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.557167 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-catalog-content\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.577860 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvrzt\" (UniqueName: \"kubernetes.io/projected/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-kube-api-access-wvrzt\") pod \"redhat-marketplace-hz9j8\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:11 crc kubenswrapper[4749]: I1001 13:55:11.601089 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:12 crc kubenswrapper[4749]: I1001 13:55:12.056655 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hz9j8"] Oct 01 13:55:12 crc kubenswrapper[4749]: I1001 13:55:12.752122 4749 generic.go:334] "Generic (PLEG): container finished" podID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerID="3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3" exitCode=0 Oct 01 13:55:12 crc kubenswrapper[4749]: I1001 13:55:12.752199 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hz9j8" event={"ID":"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d","Type":"ContainerDied","Data":"3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3"} Oct 01 13:55:12 crc kubenswrapper[4749]: I1001 13:55:12.752459 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hz9j8" event={"ID":"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d","Type":"ContainerStarted","Data":"81954b44160af45d58a61cb83ad43504d86131906f087e799a316d9c135f7d72"} Oct 01 13:55:12 crc kubenswrapper[4749]: I1001 13:55:12.756442 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 13:55:13 crc kubenswrapper[4749]: I1001 13:55:13.763613 4749 generic.go:334] "Generic (PLEG): container finished" podID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerID="5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3" exitCode=0 Oct 01 13:55:13 crc kubenswrapper[4749]: I1001 13:55:13.763664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hz9j8" event={"ID":"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d","Type":"ContainerDied","Data":"5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3"} Oct 01 13:55:14 crc kubenswrapper[4749]: I1001 13:55:14.778069 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-hz9j8" event={"ID":"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d","Type":"ContainerStarted","Data":"40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd"} Oct 01 13:55:14 crc kubenswrapper[4749]: I1001 13:55:14.808352 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hz9j8" podStartSLOduration=2.386743384 podStartE2EDuration="3.808335498s" podCreationTimestamp="2025-10-01 13:55:11 +0000 UTC" firstStartedPulling="2025-10-01 13:55:12.756177658 +0000 UTC m=+2972.810162557" lastFinishedPulling="2025-10-01 13:55:14.177769752 +0000 UTC m=+2974.231754671" observedRunningTime="2025-10-01 13:55:14.79939513 +0000 UTC m=+2974.853380029" watchObservedRunningTime="2025-10-01 13:55:14.808335498 +0000 UTC m=+2974.862320397" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.231067 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:55:16 crc kubenswrapper[4749]: E1001 13:55:16.231879 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.441478 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x6qvg"] Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.444475 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.451814 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6qvg"] Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.471208 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29gqb\" (UniqueName: \"kubernetes.io/projected/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-kube-api-access-29gqb\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.471481 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-utilities\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.474278 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-catalog-content\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.576582 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29gqb\" (UniqueName: \"kubernetes.io/projected/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-kube-api-access-29gqb\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.576664 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-utilities\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.576734 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-catalog-content\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.577569 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-utilities\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.578696 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-catalog-content\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.603099 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29gqb\" (UniqueName: \"kubernetes.io/projected/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-kube-api-access-29gqb\") pod \"community-operators-x6qvg\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:16 crc kubenswrapper[4749]: I1001 13:55:16.796037 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:17 crc kubenswrapper[4749]: I1001 13:55:17.317241 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6qvg"] Oct 01 13:55:17 crc kubenswrapper[4749]: I1001 13:55:17.809742 4749 generic.go:334] "Generic (PLEG): container finished" podID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerID="3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524" exitCode=0 Oct 01 13:55:17 crc kubenswrapper[4749]: I1001 13:55:17.809815 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6qvg" event={"ID":"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7","Type":"ContainerDied","Data":"3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524"} Oct 01 13:55:17 crc kubenswrapper[4749]: I1001 13:55:17.810022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6qvg" event={"ID":"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7","Type":"ContainerStarted","Data":"d03bad193167ed8a181c1cfb72b6ba9d5316f857fff60b2d4014e6f637fdf6f4"} Oct 01 13:55:18 crc kubenswrapper[4749]: I1001 13:55:18.824702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6qvg" event={"ID":"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7","Type":"ContainerStarted","Data":"61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016"} Oct 01 13:55:19 crc kubenswrapper[4749]: I1001 13:55:19.836162 4749 generic.go:334] "Generic (PLEG): container finished" podID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerID="61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016" exitCode=0 Oct 01 13:55:19 crc kubenswrapper[4749]: I1001 13:55:19.836245 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6qvg" 
event={"ID":"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7","Type":"ContainerDied","Data":"61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016"} Oct 01 13:55:20 crc kubenswrapper[4749]: I1001 13:55:20.860270 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6qvg" event={"ID":"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7","Type":"ContainerStarted","Data":"35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0"} Oct 01 13:55:20 crc kubenswrapper[4749]: I1001 13:55:20.883969 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x6qvg" podStartSLOduration=2.30046455 podStartE2EDuration="4.883947429s" podCreationTimestamp="2025-10-01 13:55:16 +0000 UTC" firstStartedPulling="2025-10-01 13:55:17.812621703 +0000 UTC m=+2977.866606612" lastFinishedPulling="2025-10-01 13:55:20.396104582 +0000 UTC m=+2980.450089491" observedRunningTime="2025-10-01 13:55:20.876594427 +0000 UTC m=+2980.930579356" watchObservedRunningTime="2025-10-01 13:55:20.883947429 +0000 UTC m=+2980.937932328" Oct 01 13:55:21 crc kubenswrapper[4749]: I1001 13:55:21.602181 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:21 crc kubenswrapper[4749]: I1001 13:55:21.602264 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:21 crc kubenswrapper[4749]: I1001 13:55:21.659746 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:21 crc kubenswrapper[4749]: I1001 13:55:21.937162 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:23 crc kubenswrapper[4749]: I1001 13:55:23.836733 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-hz9j8"] Oct 01 13:55:23 crc kubenswrapper[4749]: I1001 13:55:23.895632 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hz9j8" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerName="registry-server" containerID="cri-o://40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd" gracePeriod=2 Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.449674 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.650606 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-catalog-content\") pod \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.650829 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-utilities\") pod \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.650985 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvrzt\" (UniqueName: \"kubernetes.io/projected/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-kube-api-access-wvrzt\") pod \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\" (UID: \"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d\") " Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.652464 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-utilities" (OuterVolumeSpecName: "utilities") pod "6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" (UID: 
"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.653939 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.677263 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-kube-api-access-wvrzt" (OuterVolumeSpecName: "kube-api-access-wvrzt") pod "6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" (UID: "6430f74f-c3fe-4b2c-b73e-1a483e3fba6d"). InnerVolumeSpecName "kube-api-access-wvrzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.688362 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" (UID: "6430f74f-c3fe-4b2c-b73e-1a483e3fba6d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.755786 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvrzt\" (UniqueName: \"kubernetes.io/projected/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-kube-api-access-wvrzt\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.755840 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.909579 4749 generic.go:334] "Generic (PLEG): container finished" podID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerID="40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd" exitCode=0 Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.909793 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hz9j8" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.909850 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hz9j8" event={"ID":"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d","Type":"ContainerDied","Data":"40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd"} Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.910556 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hz9j8" event={"ID":"6430f74f-c3fe-4b2c-b73e-1a483e3fba6d","Type":"ContainerDied","Data":"81954b44160af45d58a61cb83ad43504d86131906f087e799a316d9c135f7d72"} Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.910599 4749 scope.go:117] "RemoveContainer" containerID="40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.958799 4749 scope.go:117] "RemoveContainer" 
containerID="5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3" Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.966182 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hz9j8"] Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.983475 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hz9j8"] Oct 01 13:55:24 crc kubenswrapper[4749]: I1001 13:55:24.985516 4749 scope.go:117] "RemoveContainer" containerID="3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3" Oct 01 13:55:25 crc kubenswrapper[4749]: I1001 13:55:25.047578 4749 scope.go:117] "RemoveContainer" containerID="40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd" Oct 01 13:55:25 crc kubenswrapper[4749]: E1001 13:55:25.048189 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd\": container with ID starting with 40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd not found: ID does not exist" containerID="40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd" Oct 01 13:55:25 crc kubenswrapper[4749]: I1001 13:55:25.048252 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd"} err="failed to get container status \"40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd\": rpc error: code = NotFound desc = could not find container \"40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd\": container with ID starting with 40d724a47c5248cf4e7fcf3fc2278ed5d83a962772c82210497aaf9a240f65dd not found: ID does not exist" Oct 01 13:55:25 crc kubenswrapper[4749]: I1001 13:55:25.048275 4749 scope.go:117] "RemoveContainer" 
containerID="5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3" Oct 01 13:55:25 crc kubenswrapper[4749]: E1001 13:55:25.048697 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3\": container with ID starting with 5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3 not found: ID does not exist" containerID="5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3" Oct 01 13:55:25 crc kubenswrapper[4749]: I1001 13:55:25.048716 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3"} err="failed to get container status \"5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3\": rpc error: code = NotFound desc = could not find container \"5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3\": container with ID starting with 5c995a63dbe043dfe792797bb0a42cef038c3bb93a6df8baa0cc093b443d5bd3 not found: ID does not exist" Oct 01 13:55:25 crc kubenswrapper[4749]: I1001 13:55:25.048728 4749 scope.go:117] "RemoveContainer" containerID="3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3" Oct 01 13:55:25 crc kubenswrapper[4749]: E1001 13:55:25.049176 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3\": container with ID starting with 3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3 not found: ID does not exist" containerID="3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3" Oct 01 13:55:25 crc kubenswrapper[4749]: I1001 13:55:25.049527 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3"} err="failed to get container status \"3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3\": rpc error: code = NotFound desc = could not find container \"3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3\": container with ID starting with 3934a4653b6d236195c3f57e99dd9d4e368ce56940b5376eedcf5249a2b3afe3 not found: ID does not exist" Oct 01 13:55:25 crc kubenswrapper[4749]: I1001 13:55:25.243904 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" path="/var/lib/kubelet/pods/6430f74f-c3fe-4b2c-b73e-1a483e3fba6d/volumes" Oct 01 13:55:26 crc kubenswrapper[4749]: I1001 13:55:26.796793 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:26 crc kubenswrapper[4749]: I1001 13:55:26.797168 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:26 crc kubenswrapper[4749]: I1001 13:55:26.872654 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:26 crc kubenswrapper[4749]: I1001 13:55:26.980691 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:28 crc kubenswrapper[4749]: I1001 13:55:28.046108 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x6qvg"] Oct 01 13:55:28 crc kubenswrapper[4749]: I1001 13:55:28.956876 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x6qvg" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerName="registry-server" 
containerID="cri-o://35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0" gracePeriod=2 Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.475795 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.575383 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29gqb\" (UniqueName: \"kubernetes.io/projected/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-kube-api-access-29gqb\") pod \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.575872 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-catalog-content\") pod \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.575943 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-utilities\") pod \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\" (UID: \"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7\") " Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.577326 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-utilities" (OuterVolumeSpecName: "utilities") pod "2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" (UID: "2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.581391 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-kube-api-access-29gqb" (OuterVolumeSpecName: "kube-api-access-29gqb") pod "2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" (UID: "2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7"). InnerVolumeSpecName "kube-api-access-29gqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.677107 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.677139 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29gqb\" (UniqueName: \"kubernetes.io/projected/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-kube-api-access-29gqb\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.865127 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" (UID: "2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.879087 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.970267 4749 generic.go:334] "Generic (PLEG): container finished" podID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerID="35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0" exitCode=0 Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.970304 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6qvg" event={"ID":"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7","Type":"ContainerDied","Data":"35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0"} Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.970330 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6qvg" event={"ID":"2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7","Type":"ContainerDied","Data":"d03bad193167ed8a181c1cfb72b6ba9d5316f857fff60b2d4014e6f637fdf6f4"} Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.970351 4749 scope.go:117] "RemoveContainer" containerID="35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0" Oct 01 13:55:29 crc kubenswrapper[4749]: I1001 13:55:29.970479 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x6qvg" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.009002 4749 scope.go:117] "RemoveContainer" containerID="61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.019679 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x6qvg"] Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.029383 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x6qvg"] Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.039304 4749 scope.go:117] "RemoveContainer" containerID="3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524" Oct 01 13:55:30 crc kubenswrapper[4749]: E1001 13:55:30.070910 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d32d2bb_4dd8_48f1_a566_a74cd8c43eb7.slice\": RecentStats: unable to find data in memory cache]" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.080856 4749 scope.go:117] "RemoveContainer" containerID="35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0" Oct 01 13:55:30 crc kubenswrapper[4749]: E1001 13:55:30.081368 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0\": container with ID starting with 35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0 not found: ID does not exist" containerID="35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.081400 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0"} 
err="failed to get container status \"35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0\": rpc error: code = NotFound desc = could not find container \"35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0\": container with ID starting with 35ecde03c1f13faca5888427fbda040405556e511634e26f8342bdea2ceed0e0 not found: ID does not exist" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.081421 4749 scope.go:117] "RemoveContainer" containerID="61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016" Oct 01 13:55:30 crc kubenswrapper[4749]: E1001 13:55:30.081603 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016\": container with ID starting with 61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016 not found: ID does not exist" containerID="61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.081619 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016"} err="failed to get container status \"61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016\": rpc error: code = NotFound desc = could not find container \"61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016\": container with ID starting with 61cb66c6aacac7580df74f9a44023f0c74a360b58bb161751b0e40e3c952d016 not found: ID does not exist" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.081630 4749 scope.go:117] "RemoveContainer" containerID="3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524" Oct 01 13:55:30 crc kubenswrapper[4749]: E1001 13:55:30.081772 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524\": container with ID starting with 3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524 not found: ID does not exist" containerID="3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.081793 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524"} err="failed to get container status \"3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524\": rpc error: code = NotFound desc = could not find container \"3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524\": container with ID starting with 3ae7ee96ffd69728cd6c81ad8e7f1331c52a731ed33c301c91b4631918da6524 not found: ID does not exist" Oct 01 13:55:30 crc kubenswrapper[4749]: I1001 13:55:30.230647 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:55:30 crc kubenswrapper[4749]: E1001 13:55:30.231853 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:55:31 crc kubenswrapper[4749]: I1001 13:55:31.243383 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" path="/var/lib/kubelet/pods/2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7/volumes" Oct 01 13:55:42 crc kubenswrapper[4749]: I1001 13:55:42.096290 4749 generic.go:334] "Generic (PLEG): container finished" podID="a74a77b0-6409-400a-a75c-115e2b2cba85" 
containerID="f7d3b7d70a7da5bb398996210d9869e776c0fdee3a6fd7d542c0acb57fd5f99e" exitCode=0 Oct 01 13:55:42 crc kubenswrapper[4749]: I1001 13:55:42.096410 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" event={"ID":"a74a77b0-6409-400a-a75c-115e2b2cba85","Type":"ContainerDied","Data":"f7d3b7d70a7da5bb398996210d9869e776c0fdee3a6fd7d542c0acb57fd5f99e"} Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.234411 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:55:43 crc kubenswrapper[4749]: E1001 13:55:43.234669 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.540398 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.669204 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-inventory\") pod \"a74a77b0-6409-400a-a75c-115e2b2cba85\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.669371 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7cxk\" (UniqueName: \"kubernetes.io/projected/a74a77b0-6409-400a-a75c-115e2b2cba85-kube-api-access-m7cxk\") pod \"a74a77b0-6409-400a-a75c-115e2b2cba85\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.669423 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-2\") pod \"a74a77b0-6409-400a-a75c-115e2b2cba85\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.669456 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-1\") pod \"a74a77b0-6409-400a-a75c-115e2b2cba85\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.669501 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-0\") pod \"a74a77b0-6409-400a-a75c-115e2b2cba85\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " Oct 01 13:55:43 crc 
kubenswrapper[4749]: I1001 13:55:43.669518 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-telemetry-combined-ca-bundle\") pod \"a74a77b0-6409-400a-a75c-115e2b2cba85\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.669546 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ssh-key\") pod \"a74a77b0-6409-400a-a75c-115e2b2cba85\" (UID: \"a74a77b0-6409-400a-a75c-115e2b2cba85\") " Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.679445 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74a77b0-6409-400a-a75c-115e2b2cba85-kube-api-access-m7cxk" (OuterVolumeSpecName: "kube-api-access-m7cxk") pod "a74a77b0-6409-400a-a75c-115e2b2cba85" (UID: "a74a77b0-6409-400a-a75c-115e2b2cba85"). InnerVolumeSpecName "kube-api-access-m7cxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.693556 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a74a77b0-6409-400a-a75c-115e2b2cba85" (UID: "a74a77b0-6409-400a-a75c-115e2b2cba85"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.700009 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a74a77b0-6409-400a-a75c-115e2b2cba85" (UID: "a74a77b0-6409-400a-a75c-115e2b2cba85"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.701118 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a74a77b0-6409-400a-a75c-115e2b2cba85" (UID: "a74a77b0-6409-400a-a75c-115e2b2cba85"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.701576 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-inventory" (OuterVolumeSpecName: "inventory") pod "a74a77b0-6409-400a-a75c-115e2b2cba85" (UID: "a74a77b0-6409-400a-a75c-115e2b2cba85"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.707569 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a74a77b0-6409-400a-a75c-115e2b2cba85" (UID: "a74a77b0-6409-400a-a75c-115e2b2cba85"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.719839 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a74a77b0-6409-400a-a75c-115e2b2cba85" (UID: "a74a77b0-6409-400a-a75c-115e2b2cba85"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.771489 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.771530 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.771540 4749 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.771550 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.771560 4749 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.771568 4749 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-m7cxk\" (UniqueName: \"kubernetes.io/projected/a74a77b0-6409-400a-a75c-115e2b2cba85-kube-api-access-m7cxk\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:43 crc kubenswrapper[4749]: I1001 13:55:43.771577 4749 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a74a77b0-6409-400a-a75c-115e2b2cba85-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 01 13:55:44 crc kubenswrapper[4749]: I1001 13:55:44.117834 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" event={"ID":"a74a77b0-6409-400a-a75c-115e2b2cba85","Type":"ContainerDied","Data":"9f4b707f3ccd9575c5ffa90925c06f91510444cee15eafbba64a6537cb6185cd"} Oct 01 13:55:44 crc kubenswrapper[4749]: I1001 13:55:44.118154 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f4b707f3ccd9575c5ffa90925c06f91510444cee15eafbba64a6537cb6185cd" Oct 01 13:55:44 crc kubenswrapper[4749]: I1001 13:55:44.117926 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6" Oct 01 13:55:56 crc kubenswrapper[4749]: I1001 13:55:56.233464 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:55:56 crc kubenswrapper[4749]: E1001 13:55:56.234550 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:56:09 crc kubenswrapper[4749]: I1001 13:56:09.229677 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:56:09 crc kubenswrapper[4749]: E1001 13:56:09.230473 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:56:20 crc kubenswrapper[4749]: I1001 13:56:20.882580 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:56:20 crc kubenswrapper[4749]: I1001 13:56:20.883395 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="prometheus" containerID="cri-o://a5ddceb0103edfb2ed73b6d0cc64290e1dc2ddda8428da5528f36c1520142145" gracePeriod=600 Oct 01 13:56:20 crc kubenswrapper[4749]: 
I1001 13:56:20.883478 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="thanos-sidecar" containerID="cri-o://b42e631344599e34e1e9b58969e05d024e40b0600037bad7b8df7b92d726f929" gracePeriod=600 Oct 01 13:56:20 crc kubenswrapper[4749]: I1001 13:56:20.883505 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="config-reloader" containerID="cri-o://a6f3daa1c0f587d33e4df7a64667922fc219b3811b5d08d4866b6f5616863a10" gracePeriod=600 Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.241171 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:56:21 crc kubenswrapper[4749]: E1001 13:56:21.241692 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.507937 4749 generic.go:334] "Generic (PLEG): container finished" podID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerID="b42e631344599e34e1e9b58969e05d024e40b0600037bad7b8df7b92d726f929" exitCode=0 Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.507978 4749 generic.go:334] "Generic (PLEG): container finished" podID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerID="a6f3daa1c0f587d33e4df7a64667922fc219b3811b5d08d4866b6f5616863a10" exitCode=0 Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.507986 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerID="a5ddceb0103edfb2ed73b6d0cc64290e1dc2ddda8428da5528f36c1520142145" exitCode=0 Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.507988 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerDied","Data":"b42e631344599e34e1e9b58969e05d024e40b0600037bad7b8df7b92d726f929"} Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.508089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerDied","Data":"a6f3daa1c0f587d33e4df7a64667922fc219b3811b5d08d4866b6f5616863a10"} Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.508159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerDied","Data":"a5ddceb0103edfb2ed73b6d0cc64290e1dc2ddda8428da5528f36c1520142145"} Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.884495 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.985718 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6n79\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-kube-api-access-g6n79\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.985765 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-secret-combined-ca-bundle\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.985801 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-tls-assets\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.985835 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-thanos-prometheus-http-client-file\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.985923 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 
01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.985953 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.985981 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-config\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.986102 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.986159 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bce57e9e-7e46-4ac2-a709-e978a98e4575-prometheus-metric-storage-rulefiles-0\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.986189 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bce57e9e-7e46-4ac2-a709-e978a98e4575-config-out\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.986220 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"bce57e9e-7e46-4ac2-a709-e978a98e4575\" (UID: \"bce57e9e-7e46-4ac2-a709-e978a98e4575\") " Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.987017 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce57e9e-7e46-4ac2-a709-e978a98e4575-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.993399 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-config" (OuterVolumeSpecName: "config") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.993993 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.994033 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-kube-api-access-g6n79" (OuterVolumeSpecName: "kube-api-access-g6n79") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "kube-api-access-g6n79". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.994070 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.994304 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.994456 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). 
InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.995034 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bce57e9e-7e46-4ac2-a709-e978a98e4575-config-out" (OuterVolumeSpecName: "config-out") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 13:56:21 crc kubenswrapper[4749]: I1001 13:56:21.996161 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.012209 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.065829 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config" (OuterVolumeSpecName: "web-config") pod "bce57e9e-7e46-4ac2-a709-e978a98e4575" (UID: "bce57e9e-7e46-4ac2-a709-e978a98e4575"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088472 4749 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088507 4749 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088541 4749 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088551 4749 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-config\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088587 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") on node \"crc\" " Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088600 4749 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/bce57e9e-7e46-4ac2-a709-e978a98e4575-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088609 4749 reconciler_common.go:293] "Volume detached for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/bce57e9e-7e46-4ac2-a709-e978a98e4575-config-out\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088619 4749 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088629 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6n79\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-kube-api-access-g6n79\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088638 4749 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bce57e9e-7e46-4ac2-a709-e978a98e4575-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.088646 4749 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bce57e9e-7e46-4ac2-a709-e978a98e4575-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.111393 4749 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.111715 4749 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a") on node "crc" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.190399 4749 reconciler_common.go:293] "Volume detached for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") on node \"crc\" DevicePath \"\"" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.518678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"bce57e9e-7e46-4ac2-a709-e978a98e4575","Type":"ContainerDied","Data":"aababca1d28fc74c9a344b5824e68f9ee0f357bc46629e420429ddf9f32190f4"} Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.518727 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.518736 4749 scope.go:117] "RemoveContainer" containerID="b42e631344599e34e1e9b58969e05d024e40b0600037bad7b8df7b92d726f929" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.543652 4749 scope.go:117] "RemoveContainer" containerID="a6f3daa1c0f587d33e4df7a64667922fc219b3811b5d08d4866b6f5616863a10" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.558515 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.572063 4749 scope.go:117] "RemoveContainer" containerID="a5ddceb0103edfb2ed73b6d0cc64290e1dc2ddda8428da5528f36c1520142145" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.576571 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598275 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerName="extract-content" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598734 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerName="extract-content" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598755 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerName="extract-content" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598761 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerName="extract-content" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598777 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="init-config-reloader" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598783 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="init-config-reloader" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598795 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerName="registry-server" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598801 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerName="registry-server" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598818 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="config-reloader" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598826 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="config-reloader" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598846 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerName="extract-utilities" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598854 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerName="extract-utilities" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598866 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerName="registry-server" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598876 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerName="registry-server" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598886 4749 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerName="extract-utilities" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598893 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerName="extract-utilities" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598905 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="prometheus" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598912 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="prometheus" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598926 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="thanos-sidecar" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598933 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="thanos-sidecar" Oct 01 13:56:22 crc kubenswrapper[4749]: E1001 13:56:22.598954 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74a77b0-6409-400a-a75c-115e2b2cba85" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.598964 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74a77b0-6409-400a-a75c-115e2b2cba85" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.599177 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d32d2bb-4dd8-48f1-a566-a74cd8c43eb7" containerName="registry-server" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.599194 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74a77b0-6409-400a-a75c-115e2b2cba85" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 13:56:22 crc 
kubenswrapper[4749]: I1001 13:56:22.599202 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="thanos-sidecar" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.599218 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="prometheus" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.599245 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" containerName="config-reloader" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.599260 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6430f74f-c3fe-4b2c-b73e-1a483e3fba6d" containerName="registry-server" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.601144 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.603020 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-df59c" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.603186 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.603682 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.603836 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.608903 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.613339 4749 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.632874 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.673494 4749 scope.go:117] "RemoveContainer" containerID="fad1cb8eb096fc62719b51016d81f562e1e5098c6f44ac776c43ab82da9ec44d" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.711588 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.711634 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.711665 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6877d0fa-8236-4975-af20-88d438464469-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.711700 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-config\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.711719 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.711761 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6877d0fa-8236-4975-af20-88d438464469-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.711800 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6877d0fa-8236-4975-af20-88d438464469-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.711973 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc 
kubenswrapper[4749]: I1001 13:56:22.712104 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.712290 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.712307 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5thr\" (UniqueName: \"kubernetes.io/projected/6877d0fa-8236-4975-af20-88d438464469-kube-api-access-b5thr\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814183 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-config\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814244 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " 
pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814309 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6877d0fa-8236-4975-af20-88d438464469-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814361 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6877d0fa-8236-4975-af20-88d438464469-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814402 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814453 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814536 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5thr\" (UniqueName: \"kubernetes.io/projected/6877d0fa-8236-4975-af20-88d438464469-kube-api-access-b5thr\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814590 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814634 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.814671 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6877d0fa-8236-4975-af20-88d438464469-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.815479 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6877d0fa-8236-4975-af20-88d438464469-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.817319 4749 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.817345 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/486c2550edb1c82035b2963cc60708287426e0ee5d361c9f2f060b32a3c68a50/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.818433 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.818966 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc 
kubenswrapper[4749]: I1001 13:56:22.819057 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6877d0fa-8236-4975-af20-88d438464469-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.819199 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-config\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.819380 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.819940 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6877d0fa-8236-4975-af20-88d438464469-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.820124 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.820886 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/6877d0fa-8236-4975-af20-88d438464469-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.831565 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5thr\" (UniqueName: \"kubernetes.io/projected/6877d0fa-8236-4975-af20-88d438464469-kube-api-access-b5thr\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.849951 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6930b037-a2fd-4818-8f2d-e53e23e30c5a\") pod \"prometheus-metric-storage-0\" (UID: \"6877d0fa-8236-4975-af20-88d438464469\") " pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:22 crc kubenswrapper[4749]: I1001 13:56:22.923784 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:23 crc kubenswrapper[4749]: I1001 13:56:23.241641 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce57e9e-7e46-4ac2-a709-e978a98e4575" path="/var/lib/kubelet/pods/bce57e9e-7e46-4ac2-a709-e978a98e4575/volumes" Oct 01 13:56:23 crc kubenswrapper[4749]: I1001 13:56:23.449052 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 01 13:56:23 crc kubenswrapper[4749]: I1001 13:56:23.530510 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6877d0fa-8236-4975-af20-88d438464469","Type":"ContainerStarted","Data":"16f646552d552efcb69e872b935b3791be262194647de1b9841c56ad0dd6ea89"} Oct 01 13:56:28 crc kubenswrapper[4749]: I1001 13:56:28.594008 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6877d0fa-8236-4975-af20-88d438464469","Type":"ContainerStarted","Data":"f591a8a07aef06c7f7454de8029a9ed1bf9947c4b0449bec9aa132f0facff053"} Oct 01 13:56:33 crc kubenswrapper[4749]: I1001 13:56:33.229740 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:56:33 crc kubenswrapper[4749]: E1001 13:56:33.230566 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:56:36 crc kubenswrapper[4749]: I1001 13:56:36.697088 4749 generic.go:334] "Generic (PLEG): container finished" podID="6877d0fa-8236-4975-af20-88d438464469" 
containerID="f591a8a07aef06c7f7454de8029a9ed1bf9947c4b0449bec9aa132f0facff053" exitCode=0 Oct 01 13:56:36 crc kubenswrapper[4749]: I1001 13:56:36.697184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6877d0fa-8236-4975-af20-88d438464469","Type":"ContainerDied","Data":"f591a8a07aef06c7f7454de8029a9ed1bf9947c4b0449bec9aa132f0facff053"} Oct 01 13:56:37 crc kubenswrapper[4749]: I1001 13:56:37.712590 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6877d0fa-8236-4975-af20-88d438464469","Type":"ContainerStarted","Data":"17c6dd4c24fcbc1915330d7d0980ab88edd0444bb8cdb9071bc17b68b84d243b"} Oct 01 13:56:40 crc kubenswrapper[4749]: I1001 13:56:40.750857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6877d0fa-8236-4975-af20-88d438464469","Type":"ContainerStarted","Data":"e884b1051d378be790f7e5b652f34f274acff051a666374239cb501434abae8a"} Oct 01 13:56:40 crc kubenswrapper[4749]: I1001 13:56:40.751504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6877d0fa-8236-4975-af20-88d438464469","Type":"ContainerStarted","Data":"ba06077c43c80194e59f23edd128140dab4fddbe58c6d14bed7118672fe6d6e1"} Oct 01 13:56:40 crc kubenswrapper[4749]: I1001 13:56:40.776339 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.776319376 podStartE2EDuration="18.776319376s" podCreationTimestamp="2025-10-01 13:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 13:56:40.770235511 +0000 UTC m=+3060.824220420" watchObservedRunningTime="2025-10-01 13:56:40.776319376 +0000 UTC m=+3060.830304275" Oct 01 13:56:42 crc kubenswrapper[4749]: I1001 13:56:42.924595 4749 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:47 crc kubenswrapper[4749]: I1001 13:56:47.229800 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:56:47 crc kubenswrapper[4749]: E1001 13:56:47.230646 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:56:52 crc kubenswrapper[4749]: I1001 13:56:52.924226 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:52 crc kubenswrapper[4749]: I1001 13:56:52.928912 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 01 13:56:53 crc kubenswrapper[4749]: I1001 13:56:53.882755 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 01 13:57:00 crc kubenswrapper[4749]: I1001 13:57:00.230539 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:57:00 crc kubenswrapper[4749]: E1001 13:57:00.231109 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:57:13 crc 
kubenswrapper[4749]: I1001 13:57:13.239604 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:57:13 crc kubenswrapper[4749]: E1001 13:57:13.240423 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.324173 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.326205 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.328096 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.328204 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.328668 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.335678 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sfnbt" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.343626 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.525171 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-config-data\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.525569 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.525598 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.525628 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.525653 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.525714 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.525737 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg9pn\" (UniqueName: \"kubernetes.io/projected/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-kube-api-access-mg9pn\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.526259 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.526343 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.628638 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-config-data\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.628739 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.628778 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.628815 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.628860 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.628950 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.628981 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg9pn\" (UniqueName: 
\"kubernetes.io/projected/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-kube-api-access-mg9pn\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.629040 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.629089 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.630380 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.630383 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.630750 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.631120 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.634829 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-config-data\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.636377 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.646904 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.647948 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " 
pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.662811 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg9pn\" (UniqueName: \"kubernetes.io/projected/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-kube-api-access-mg9pn\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.684100 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " pod="openstack/tempest-tests-tempest" Oct 01 13:57:15 crc kubenswrapper[4749]: I1001 13:57:15.952236 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 13:57:16 crc kubenswrapper[4749]: I1001 13:57:16.431841 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 13:57:17 crc kubenswrapper[4749]: I1001 13:57:17.114610 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d","Type":"ContainerStarted","Data":"15c30f8d386f30b556bc7e303126366e4eb5b0ddf63bbfbfe9c12678efe242fa"} Oct 01 13:57:28 crc kubenswrapper[4749]: I1001 13:57:28.230418 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:57:28 crc kubenswrapper[4749]: E1001 13:57:28.231747 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:57:30 crc kubenswrapper[4749]: I1001 13:57:30.270318 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d","Type":"ContainerStarted","Data":"848c24291842e13f039a155fa65ab0508347882de0b767542099dbcc3c422de4"} Oct 01 13:57:30 crc kubenswrapper[4749]: I1001 13:57:30.297205 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.000102649 podStartE2EDuration="16.297181909s" podCreationTimestamp="2025-10-01 13:57:14 +0000 UTC" firstStartedPulling="2025-10-01 13:57:16.440649208 +0000 UTC m=+3096.494634107" lastFinishedPulling="2025-10-01 13:57:28.737728468 +0000 UTC m=+3108.791713367" observedRunningTime="2025-10-01 13:57:30.295403247 +0000 UTC m=+3110.349388146" watchObservedRunningTime="2025-10-01 13:57:30.297181909 +0000 UTC m=+3110.351166828" Oct 01 13:57:39 crc kubenswrapper[4749]: I1001 13:57:39.230182 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:57:39 crc kubenswrapper[4749]: E1001 13:57:39.231361 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:57:50 crc kubenswrapper[4749]: I1001 13:57:50.231256 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:57:50 crc kubenswrapper[4749]: E1001 13:57:50.231972 4749 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:58:03 crc kubenswrapper[4749]: I1001 13:58:03.230152 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:58:03 crc kubenswrapper[4749]: E1001 13:58:03.231268 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:58:18 crc kubenswrapper[4749]: I1001 13:58:18.240874 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:58:18 crc kubenswrapper[4749]: E1001 13:58:18.242075 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:58:33 crc kubenswrapper[4749]: I1001 13:58:33.230628 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:58:33 crc kubenswrapper[4749]: E1001 13:58:33.231365 4749 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:58:46 crc kubenswrapper[4749]: I1001 13:58:46.230263 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:58:46 crc kubenswrapper[4749]: E1001 13:58:46.231510 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:58:59 crc kubenswrapper[4749]: I1001 13:58:59.230124 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:58:59 crc kubenswrapper[4749]: E1001 13:58:59.231159 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 13:59:11 crc kubenswrapper[4749]: I1001 13:59:11.239984 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082" Oct 01 13:59:12 crc kubenswrapper[4749]: I1001 
13:59:12.494926 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"5ab6ef42cbe1670573b773f73a3eec3d9985fa70a7036319c387479a68495856"} Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.165494 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn"] Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.167833 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.172343 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.172663 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.177979 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn"] Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.267921 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6njxq\" (UniqueName: \"kubernetes.io/projected/7a20b509-c4f1-469c-a228-d779f1a05c9d-kube-api-access-6njxq\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.268046 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7a20b509-c4f1-469c-a228-d779f1a05c9d-config-volume\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.268112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a20b509-c4f1-469c-a228-d779f1a05c9d-secret-volume\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.370375 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a20b509-c4f1-469c-a228-d779f1a05c9d-secret-volume\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.370620 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6njxq\" (UniqueName: \"kubernetes.io/projected/7a20b509-c4f1-469c-a228-d779f1a05c9d-kube-api-access-6njxq\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.370749 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a20b509-c4f1-469c-a228-d779f1a05c9d-config-volume\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: 
I1001 14:00:00.372152 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a20b509-c4f1-469c-a228-d779f1a05c9d-config-volume\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.376511 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a20b509-c4f1-469c-a228-d779f1a05c9d-secret-volume\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.388802 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6njxq\" (UniqueName: \"kubernetes.io/projected/7a20b509-c4f1-469c-a228-d779f1a05c9d-kube-api-access-6njxq\") pod \"collect-profiles-29322120-7gvzn\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.494271 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:00 crc kubenswrapper[4749]: I1001 14:00:00.963847 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn"] Oct 01 14:00:01 crc kubenswrapper[4749]: I1001 14:00:01.032734 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" event={"ID":"7a20b509-c4f1-469c-a228-d779f1a05c9d","Type":"ContainerStarted","Data":"b44f35a95bbac11d31c2ef36b72e93a304da6fbf5235dd712b661785da528178"} Oct 01 14:00:02 crc kubenswrapper[4749]: I1001 14:00:02.048818 4749 generic.go:334] "Generic (PLEG): container finished" podID="7a20b509-c4f1-469c-a228-d779f1a05c9d" containerID="a21f49b8e804f53cd8d3e978b2129d45d5e3ab958cf8fa5871ac45f9b24addca" exitCode=0 Oct 01 14:00:02 crc kubenswrapper[4749]: I1001 14:00:02.048934 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" event={"ID":"7a20b509-c4f1-469c-a228-d779f1a05c9d","Type":"ContainerDied","Data":"a21f49b8e804f53cd8d3e978b2129d45d5e3ab958cf8fa5871ac45f9b24addca"} Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.431089 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.634121 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a20b509-c4f1-469c-a228-d779f1a05c9d-config-volume\") pod \"7a20b509-c4f1-469c-a228-d779f1a05c9d\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.634354 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a20b509-c4f1-469c-a228-d779f1a05c9d-secret-volume\") pod \"7a20b509-c4f1-469c-a228-d779f1a05c9d\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.634394 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6njxq\" (UniqueName: \"kubernetes.io/projected/7a20b509-c4f1-469c-a228-d779f1a05c9d-kube-api-access-6njxq\") pod \"7a20b509-c4f1-469c-a228-d779f1a05c9d\" (UID: \"7a20b509-c4f1-469c-a228-d779f1a05c9d\") " Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.635248 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a20b509-c4f1-469c-a228-d779f1a05c9d-config-volume" (OuterVolumeSpecName: "config-volume") pod "7a20b509-c4f1-469c-a228-d779f1a05c9d" (UID: "7a20b509-c4f1-469c-a228-d779f1a05c9d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.641931 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a20b509-c4f1-469c-a228-d779f1a05c9d-kube-api-access-6njxq" (OuterVolumeSpecName: "kube-api-access-6njxq") pod "7a20b509-c4f1-469c-a228-d779f1a05c9d" (UID: "7a20b509-c4f1-469c-a228-d779f1a05c9d"). 
InnerVolumeSpecName "kube-api-access-6njxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.661584 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a20b509-c4f1-469c-a228-d779f1a05c9d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7a20b509-c4f1-469c-a228-d779f1a05c9d" (UID: "7a20b509-c4f1-469c-a228-d779f1a05c9d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.736334 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7a20b509-c4f1-469c-a228-d779f1a05c9d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.736387 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6njxq\" (UniqueName: \"kubernetes.io/projected/7a20b509-c4f1-469c-a228-d779f1a05c9d-kube-api-access-6njxq\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:03 crc kubenswrapper[4749]: I1001 14:00:03.736396 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a20b509-c4f1-469c-a228-d779f1a05c9d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:00:04 crc kubenswrapper[4749]: I1001 14:00:04.072159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" event={"ID":"7a20b509-c4f1-469c-a228-d779f1a05c9d","Type":"ContainerDied","Data":"b44f35a95bbac11d31c2ef36b72e93a304da6fbf5235dd712b661785da528178"} Oct 01 14:00:04 crc kubenswrapper[4749]: I1001 14:00:04.072512 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b44f35a95bbac11d31c2ef36b72e93a304da6fbf5235dd712b661785da528178" Oct 01 14:00:04 crc kubenswrapper[4749]: I1001 14:00:04.072607 4749 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn" Oct 01 14:00:04 crc kubenswrapper[4749]: I1001 14:00:04.526501 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8"] Oct 01 14:00:04 crc kubenswrapper[4749]: I1001 14:00:04.538158 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322075-nrdq8"] Oct 01 14:00:05 crc kubenswrapper[4749]: I1001 14:00:05.244119 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f3049be-5196-40de-8ff1-1895937e9510" path="/var/lib/kubelet/pods/3f3049be-5196-40de-8ff1-1895937e9510/volumes" Oct 01 14:00:10 crc kubenswrapper[4749]: I1001 14:00:10.123095 4749 scope.go:117] "RemoveContainer" containerID="7bd4bbb0493c1232b12bf40d1307beb514880c0a626d7b82b2ce817ad73344e6" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.171012 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29322121-ksm5l"] Oct 01 14:01:00 crc kubenswrapper[4749]: E1001 14:01:00.171968 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a20b509-c4f1-469c-a228-d779f1a05c9d" containerName="collect-profiles" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.171983 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a20b509-c4f1-469c-a228-d779f1a05c9d" containerName="collect-profiles" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.174045 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a20b509-c4f1-469c-a228-d779f1a05c9d" containerName="collect-profiles" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.174760 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.197890 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322121-ksm5l"] Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.352268 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-config-data\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.352320 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-combined-ca-bundle\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.352415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-fernet-keys\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.352495 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc2mt\" (UniqueName: \"kubernetes.io/projected/9f1c1f6f-c5a5-499c-874f-245d4d918274-kube-api-access-jc2mt\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.454759 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jc2mt\" (UniqueName: \"kubernetes.io/projected/9f1c1f6f-c5a5-499c-874f-245d4d918274-kube-api-access-jc2mt\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.454841 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-config-data\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.454864 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-combined-ca-bundle\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.454983 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-fernet-keys\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.461540 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-fernet-keys\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.462212 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-config-data\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.478135 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc2mt\" (UniqueName: \"kubernetes.io/projected/9f1c1f6f-c5a5-499c-874f-245d4d918274-kube-api-access-jc2mt\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.478490 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-combined-ca-bundle\") pod \"keystone-cron-29322121-ksm5l\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:00 crc kubenswrapper[4749]: I1001 14:01:00.506573 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:01 crc kubenswrapper[4749]: I1001 14:01:01.004611 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322121-ksm5l"] Oct 01 14:01:01 crc kubenswrapper[4749]: I1001 14:01:01.732855 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-ksm5l" event={"ID":"9f1c1f6f-c5a5-499c-874f-245d4d918274","Type":"ContainerStarted","Data":"acd4b4bc300f597f5f9e7c6fff0a6303e2f45efe7a89527b0d9d0245f07e713b"} Oct 01 14:01:01 crc kubenswrapper[4749]: I1001 14:01:01.733258 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-ksm5l" event={"ID":"9f1c1f6f-c5a5-499c-874f-245d4d918274","Type":"ContainerStarted","Data":"7e29518319b39f437663a4ef9567e19da4ccb98e55ff53778471edf2729d54f0"} Oct 01 14:01:06 crc kubenswrapper[4749]: I1001 14:01:06.781449 4749 generic.go:334] "Generic (PLEG): container finished" podID="9f1c1f6f-c5a5-499c-874f-245d4d918274" containerID="acd4b4bc300f597f5f9e7c6fff0a6303e2f45efe7a89527b0d9d0245f07e713b" exitCode=0 Oct 01 14:01:06 crc kubenswrapper[4749]: I1001 14:01:06.781545 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-ksm5l" event={"ID":"9f1c1f6f-c5a5-499c-874f-245d4d918274","Type":"ContainerDied","Data":"acd4b4bc300f597f5f9e7c6fff0a6303e2f45efe7a89527b0d9d0245f07e713b"} Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.172791 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.256770 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-combined-ca-bundle\") pod \"9f1c1f6f-c5a5-499c-874f-245d4d918274\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.256871 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-fernet-keys\") pod \"9f1c1f6f-c5a5-499c-874f-245d4d918274\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.256971 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc2mt\" (UniqueName: \"kubernetes.io/projected/9f1c1f6f-c5a5-499c-874f-245d4d918274-kube-api-access-jc2mt\") pod \"9f1c1f6f-c5a5-499c-874f-245d4d918274\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.257039 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-config-data\") pod \"9f1c1f6f-c5a5-499c-874f-245d4d918274\" (UID: \"9f1c1f6f-c5a5-499c-874f-245d4d918274\") " Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.263071 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1c1f6f-c5a5-499c-874f-245d4d918274-kube-api-access-jc2mt" (OuterVolumeSpecName: "kube-api-access-jc2mt") pod "9f1c1f6f-c5a5-499c-874f-245d4d918274" (UID: "9f1c1f6f-c5a5-499c-874f-245d4d918274"). InnerVolumeSpecName "kube-api-access-jc2mt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.271653 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9f1c1f6f-c5a5-499c-874f-245d4d918274" (UID: "9f1c1f6f-c5a5-499c-874f-245d4d918274"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.286897 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f1c1f6f-c5a5-499c-874f-245d4d918274" (UID: "9f1c1f6f-c5a5-499c-874f-245d4d918274"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.327656 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-config-data" (OuterVolumeSpecName: "config-data") pod "9f1c1f6f-c5a5-499c-874f-245d4d918274" (UID: "9f1c1f6f-c5a5-499c-874f-245d4d918274"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.361764 4749 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.361815 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc2mt\" (UniqueName: \"kubernetes.io/projected/9f1c1f6f-c5a5-499c-874f-245d4d918274-kube-api-access-jc2mt\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.361836 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.361856 4749 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f1c1f6f-c5a5-499c-874f-245d4d918274-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.803050 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322121-ksm5l" event={"ID":"9f1c1f6f-c5a5-499c-874f-245d4d918274","Type":"ContainerDied","Data":"7e29518319b39f437663a4ef9567e19da4ccb98e55ff53778471edf2729d54f0"} Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.803108 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322121-ksm5l" Oct 01 14:01:08 crc kubenswrapper[4749]: I1001 14:01:08.803122 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e29518319b39f437663a4ef9567e19da4ccb98e55ff53778471edf2729d54f0" Oct 01 14:01:09 crc kubenswrapper[4749]: E1001 14:01:09.014148 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f1c1f6f_c5a5_499c_874f_245d4d918274.slice/crio-7e29518319b39f437663a4ef9567e19da4ccb98e55ff53778471edf2729d54f0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f1c1f6f_c5a5_499c_874f_245d4d918274.slice\": RecentStats: unable to find data in memory cache]" Oct 01 14:01:19 crc kubenswrapper[4749]: I1001 14:01:19.974299 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tggkl"] Oct 01 14:01:19 crc kubenswrapper[4749]: E1001 14:01:19.975342 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1c1f6f-c5a5-499c-874f-245d4d918274" containerName="keystone-cron" Oct 01 14:01:19 crc kubenswrapper[4749]: I1001 14:01:19.975361 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1c1f6f-c5a5-499c-874f-245d4d918274" containerName="keystone-cron" Oct 01 14:01:19 crc kubenswrapper[4749]: I1001 14:01:19.980656 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1c1f6f-c5a5-499c-874f-245d4d918274" containerName="keystone-cron" Oct 01 14:01:19 crc kubenswrapper[4749]: I1001 14:01:19.982233 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:19 crc kubenswrapper[4749]: I1001 14:01:19.996639 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tggkl"] Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.109764 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpvkf\" (UniqueName: \"kubernetes.io/projected/bd49ae45-1a35-47d4-953f-7d8294b339e3-kube-api-access-jpvkf\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.111246 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-utilities\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.111393 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-catalog-content\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.212859 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-utilities\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.212935 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-catalog-content\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.213002 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpvkf\" (UniqueName: \"kubernetes.io/projected/bd49ae45-1a35-47d4-953f-7d8294b339e3-kube-api-access-jpvkf\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.213908 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-utilities\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.214249 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-catalog-content\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.237927 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpvkf\" (UniqueName: \"kubernetes.io/projected/bd49ae45-1a35-47d4-953f-7d8294b339e3-kube-api-access-jpvkf\") pod \"redhat-operators-tggkl\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.312913 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.764161 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tggkl"] Oct 01 14:01:20 crc kubenswrapper[4749]: I1001 14:01:20.918159 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tggkl" event={"ID":"bd49ae45-1a35-47d4-953f-7d8294b339e3","Type":"ContainerStarted","Data":"8b38ed625ea3605fe14543ccd87a6aec46dd9e8da1cd0ab89e4331d980ea3574"} Oct 01 14:01:21 crc kubenswrapper[4749]: I1001 14:01:21.935937 4749 generic.go:334] "Generic (PLEG): container finished" podID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerID="905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9" exitCode=0 Oct 01 14:01:21 crc kubenswrapper[4749]: I1001 14:01:21.936100 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tggkl" event={"ID":"bd49ae45-1a35-47d4-953f-7d8294b339e3","Type":"ContainerDied","Data":"905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9"} Oct 01 14:01:21 crc kubenswrapper[4749]: I1001 14:01:21.938333 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:01:23 crc kubenswrapper[4749]: I1001 14:01:23.954174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tggkl" event={"ID":"bd49ae45-1a35-47d4-953f-7d8294b339e3","Type":"ContainerStarted","Data":"a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50"} Oct 01 14:01:25 crc kubenswrapper[4749]: I1001 14:01:25.973938 4749 generic.go:334] "Generic (PLEG): container finished" podID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerID="a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50" exitCode=0 Oct 01 14:01:25 crc kubenswrapper[4749]: I1001 14:01:25.974022 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-tggkl" event={"ID":"bd49ae45-1a35-47d4-953f-7d8294b339e3","Type":"ContainerDied","Data":"a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50"} Oct 01 14:01:26 crc kubenswrapper[4749]: I1001 14:01:26.987976 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tggkl" event={"ID":"bd49ae45-1a35-47d4-953f-7d8294b339e3","Type":"ContainerStarted","Data":"030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67"} Oct 01 14:01:27 crc kubenswrapper[4749]: I1001 14:01:27.018618 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tggkl" podStartSLOduration=3.413382812 podStartE2EDuration="8.018599762s" podCreationTimestamp="2025-10-01 14:01:19 +0000 UTC" firstStartedPulling="2025-10-01 14:01:21.938088044 +0000 UTC m=+3341.992072943" lastFinishedPulling="2025-10-01 14:01:26.543304974 +0000 UTC m=+3346.597289893" observedRunningTime="2025-10-01 14:01:27.01434826 +0000 UTC m=+3347.068333159" watchObservedRunningTime="2025-10-01 14:01:27.018599762 +0000 UTC m=+3347.072584661" Oct 01 14:01:30 crc kubenswrapper[4749]: I1001 14:01:30.313322 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:30 crc kubenswrapper[4749]: I1001 14:01:30.313720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:31 crc kubenswrapper[4749]: I1001 14:01:31.377416 4749 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tggkl" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerName="registry-server" probeResult="failure" output=< Oct 01 14:01:31 crc kubenswrapper[4749]: timeout: failed to connect service ":50051" within 1s Oct 01 14:01:31 crc kubenswrapper[4749]: > Oct 01 14:01:32 crc kubenswrapper[4749]: I1001 
14:01:32.106025 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:01:32 crc kubenswrapper[4749]: I1001 14:01:32.106088 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:01:40 crc kubenswrapper[4749]: I1001 14:01:40.363273 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:40 crc kubenswrapper[4749]: I1001 14:01:40.418628 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:41 crc kubenswrapper[4749]: I1001 14:01:41.254249 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tggkl"] Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.124337 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tggkl" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerName="registry-server" containerID="cri-o://030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67" gracePeriod=2 Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.646547 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.694486 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpvkf\" (UniqueName: \"kubernetes.io/projected/bd49ae45-1a35-47d4-953f-7d8294b339e3-kube-api-access-jpvkf\") pod \"bd49ae45-1a35-47d4-953f-7d8294b339e3\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.694587 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-catalog-content\") pod \"bd49ae45-1a35-47d4-953f-7d8294b339e3\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.694637 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-utilities\") pod \"bd49ae45-1a35-47d4-953f-7d8294b339e3\" (UID: \"bd49ae45-1a35-47d4-953f-7d8294b339e3\") " Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.695651 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-utilities" (OuterVolumeSpecName: "utilities") pod "bd49ae45-1a35-47d4-953f-7d8294b339e3" (UID: "bd49ae45-1a35-47d4-953f-7d8294b339e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.700340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd49ae45-1a35-47d4-953f-7d8294b339e3-kube-api-access-jpvkf" (OuterVolumeSpecName: "kube-api-access-jpvkf") pod "bd49ae45-1a35-47d4-953f-7d8294b339e3" (UID: "bd49ae45-1a35-47d4-953f-7d8294b339e3"). InnerVolumeSpecName "kube-api-access-jpvkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.773359 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd49ae45-1a35-47d4-953f-7d8294b339e3" (UID: "bd49ae45-1a35-47d4-953f-7d8294b339e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.796520 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpvkf\" (UniqueName: \"kubernetes.io/projected/bd49ae45-1a35-47d4-953f-7d8294b339e3-kube-api-access-jpvkf\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.796553 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:42 crc kubenswrapper[4749]: I1001 14:01:42.796563 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd49ae45-1a35-47d4-953f-7d8294b339e3-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.136122 4749 generic.go:334] "Generic (PLEG): container finished" podID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerID="030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67" exitCode=0 Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.136174 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tggkl" event={"ID":"bd49ae45-1a35-47d4-953f-7d8294b339e3","Type":"ContainerDied","Data":"030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67"} Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.136192 4749 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tggkl" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.136208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tggkl" event={"ID":"bd49ae45-1a35-47d4-953f-7d8294b339e3","Type":"ContainerDied","Data":"8b38ed625ea3605fe14543ccd87a6aec46dd9e8da1cd0ab89e4331d980ea3574"} Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.136242 4749 scope.go:117] "RemoveContainer" containerID="030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.175623 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tggkl"] Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.177375 4749 scope.go:117] "RemoveContainer" containerID="a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.184773 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tggkl"] Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.208634 4749 scope.go:117] "RemoveContainer" containerID="905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.244432 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" path="/var/lib/kubelet/pods/bd49ae45-1a35-47d4-953f-7d8294b339e3/volumes" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.248864 4749 scope.go:117] "RemoveContainer" containerID="030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67" Oct 01 14:01:43 crc kubenswrapper[4749]: E1001 14:01:43.249930 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67\": container with ID starting with 
030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67 not found: ID does not exist" containerID="030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.250003 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67"} err="failed to get container status \"030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67\": rpc error: code = NotFound desc = could not find container \"030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67\": container with ID starting with 030ace9bf5d61225171cc4245bb34e1eb2c6a8c75ead3fdcad15dd717b522f67 not found: ID does not exist" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.250032 4749 scope.go:117] "RemoveContainer" containerID="a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50" Oct 01 14:01:43 crc kubenswrapper[4749]: E1001 14:01:43.250459 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50\": container with ID starting with a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50 not found: ID does not exist" containerID="a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.250497 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50"} err="failed to get container status \"a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50\": rpc error: code = NotFound desc = could not find container \"a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50\": container with ID starting with a8cd771dcbb3bc4d73a6880c1a129a01efb8c1263da664b265000c4a47baab50 not found: ID does not 
exist" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.250524 4749 scope.go:117] "RemoveContainer" containerID="905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9" Oct 01 14:01:43 crc kubenswrapper[4749]: E1001 14:01:43.250861 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9\": container with ID starting with 905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9 not found: ID does not exist" containerID="905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9" Oct 01 14:01:43 crc kubenswrapper[4749]: I1001 14:01:43.250884 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9"} err="failed to get container status \"905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9\": rpc error: code = NotFound desc = could not find container \"905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9\": container with ID starting with 905fbca5949e0a460f12c8aa6915179a0b1bfd9aa2876e58ea5e64b70be111b9 not found: ID does not exist" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.642604 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8nfcw"] Oct 01 14:01:56 crc kubenswrapper[4749]: E1001 14:01:56.643607 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerName="extract-content" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.643623 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerName="extract-content" Oct 01 14:01:56 crc kubenswrapper[4749]: E1001 14:01:56.643635 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" 
containerName="registry-server" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.643642 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerName="registry-server" Oct 01 14:01:56 crc kubenswrapper[4749]: E1001 14:01:56.643663 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerName="extract-utilities" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.643673 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerName="extract-utilities" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.643941 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd49ae45-1a35-47d4-953f-7d8294b339e3" containerName="registry-server" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.646273 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.670736 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nfcw"] Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.744474 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjms9\" (UniqueName: \"kubernetes.io/projected/f6e2c9d3-03b5-43ca-beb3-609de31493de-kube-api-access-kjms9\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.744669 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-catalog-content\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " 
pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.744711 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-utilities\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.845998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-utilities\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.846120 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjms9\" (UniqueName: \"kubernetes.io/projected/f6e2c9d3-03b5-43ca-beb3-609de31493de-kube-api-access-kjms9\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.846291 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-catalog-content\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.846618 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-utilities\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " 
pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.846743 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-catalog-content\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.881805 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjms9\" (UniqueName: \"kubernetes.io/projected/f6e2c9d3-03b5-43ca-beb3-609de31493de-kube-api-access-kjms9\") pod \"certified-operators-8nfcw\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") " pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:56 crc kubenswrapper[4749]: I1001 14:01:56.980972 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:01:57 crc kubenswrapper[4749]: I1001 14:01:57.503495 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nfcw"] Oct 01 14:01:58 crc kubenswrapper[4749]: I1001 14:01:58.289731 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerID="5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a" exitCode=0 Oct 01 14:01:58 crc kubenswrapper[4749]: I1001 14:01:58.289779 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nfcw" event={"ID":"f6e2c9d3-03b5-43ca-beb3-609de31493de","Type":"ContainerDied","Data":"5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a"} Oct 01 14:01:58 crc kubenswrapper[4749]: I1001 14:01:58.290067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nfcw" 
event={"ID":"f6e2c9d3-03b5-43ca-beb3-609de31493de","Type":"ContainerStarted","Data":"897ab1db61c0f0fb2c8cfaaa67a1f85a34c7f84befecfda1a0a4475259050eb8"} Oct 01 14:01:59 crc kubenswrapper[4749]: I1001 14:01:59.302658 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nfcw" event={"ID":"f6e2c9d3-03b5-43ca-beb3-609de31493de","Type":"ContainerStarted","Data":"da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481"} Oct 01 14:02:00 crc kubenswrapper[4749]: I1001 14:02:00.324763 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerID="da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481" exitCode=0 Oct 01 14:02:00 crc kubenswrapper[4749]: I1001 14:02:00.325005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nfcw" event={"ID":"f6e2c9d3-03b5-43ca-beb3-609de31493de","Type":"ContainerDied","Data":"da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481"} Oct 01 14:02:01 crc kubenswrapper[4749]: I1001 14:02:01.340325 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nfcw" event={"ID":"f6e2c9d3-03b5-43ca-beb3-609de31493de","Type":"ContainerStarted","Data":"e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb"} Oct 01 14:02:01 crc kubenswrapper[4749]: I1001 14:02:01.364980 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8nfcw" podStartSLOduration=2.853045486 podStartE2EDuration="5.364958208s" podCreationTimestamp="2025-10-01 14:01:56 +0000 UTC" firstStartedPulling="2025-10-01 14:01:58.291065647 +0000 UTC m=+3378.345050546" lastFinishedPulling="2025-10-01 14:02:00.802978359 +0000 UTC m=+3380.856963268" observedRunningTime="2025-10-01 14:02:01.357244016 +0000 UTC m=+3381.411228915" watchObservedRunningTime="2025-10-01 14:02:01.364958208 +0000 UTC 
m=+3381.418943107" Oct 01 14:02:02 crc kubenswrapper[4749]: I1001 14:02:02.106029 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:02:02 crc kubenswrapper[4749]: I1001 14:02:02.106392 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:02:06 crc kubenswrapper[4749]: I1001 14:02:06.981166 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:02:06 crc kubenswrapper[4749]: I1001 14:02:06.981889 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:02:07 crc kubenswrapper[4749]: I1001 14:02:07.038361 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:02:07 crc kubenswrapper[4749]: I1001 14:02:07.478420 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8nfcw" Oct 01 14:02:07 crc kubenswrapper[4749]: I1001 14:02:07.526115 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nfcw"] Oct 01 14:02:09 crc kubenswrapper[4749]: I1001 14:02:09.432943 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8nfcw" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerName="registry-server" 
containerID="cri-o://e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb" gracePeriod=2
Oct 01 14:02:09 crc kubenswrapper[4749]: I1001 14:02:09.886840 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nfcw"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.033187 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-catalog-content\") pod \"f6e2c9d3-03b5-43ca-beb3-609de31493de\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") "
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.033307 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-utilities\") pod \"f6e2c9d3-03b5-43ca-beb3-609de31493de\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") "
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.033503 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjms9\" (UniqueName: \"kubernetes.io/projected/f6e2c9d3-03b5-43ca-beb3-609de31493de-kube-api-access-kjms9\") pod \"f6e2c9d3-03b5-43ca-beb3-609de31493de\" (UID: \"f6e2c9d3-03b5-43ca-beb3-609de31493de\") "
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.034397 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-utilities" (OuterVolumeSpecName: "utilities") pod "f6e2c9d3-03b5-43ca-beb3-609de31493de" (UID: "f6e2c9d3-03b5-43ca-beb3-609de31493de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.038763 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e2c9d3-03b5-43ca-beb3-609de31493de-kube-api-access-kjms9" (OuterVolumeSpecName: "kube-api-access-kjms9") pod "f6e2c9d3-03b5-43ca-beb3-609de31493de" (UID: "f6e2c9d3-03b5-43ca-beb3-609de31493de"). InnerVolumeSpecName "kube-api-access-kjms9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.135976 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.136026 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjms9\" (UniqueName: \"kubernetes.io/projected/f6e2c9d3-03b5-43ca-beb3-609de31493de-kube-api-access-kjms9\") on node \"crc\" DevicePath \"\""
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.447169 4749 generic.go:334] "Generic (PLEG): container finished" podID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerID="e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb" exitCode=0
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.447235 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nfcw"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.447257 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nfcw" event={"ID":"f6e2c9d3-03b5-43ca-beb3-609de31493de","Type":"ContainerDied","Data":"e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb"}
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.447721 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nfcw" event={"ID":"f6e2c9d3-03b5-43ca-beb3-609de31493de","Type":"ContainerDied","Data":"897ab1db61c0f0fb2c8cfaaa67a1f85a34c7f84befecfda1a0a4475259050eb8"}
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.447756 4749 scope.go:117] "RemoveContainer" containerID="e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.472066 4749 scope.go:117] "RemoveContainer" containerID="da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.493967 4749 scope.go:117] "RemoveContainer" containerID="5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.530424 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6e2c9d3-03b5-43ca-beb3-609de31493de" (UID: "f6e2c9d3-03b5-43ca-beb3-609de31493de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.543167 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6e2c9d3-03b5-43ca-beb3-609de31493de-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.546885 4749 scope.go:117] "RemoveContainer" containerID="e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb"
Oct 01 14:02:10 crc kubenswrapper[4749]: E1001 14:02:10.547442 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb\": container with ID starting with e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb not found: ID does not exist" containerID="e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.547482 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb"} err="failed to get container status \"e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb\": rpc error: code = NotFound desc = could not find container \"e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb\": container with ID starting with e0a9a8c7a97f418272ced07d0ee9faff4a03163b4f16c0df1f23f7ad601a12cb not found: ID does not exist"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.547508 4749 scope.go:117] "RemoveContainer" containerID="da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481"
Oct 01 14:02:10 crc kubenswrapper[4749]: E1001 14:02:10.547789 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481\": container with ID starting with da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481 not found: ID does not exist" containerID="da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.547844 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481"} err="failed to get container status \"da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481\": rpc error: code = NotFound desc = could not find container \"da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481\": container with ID starting with da9df71d11b167a24ea940cb84bd3d821033c40fe4c486f6de17642cf7ca7481 not found: ID does not exist"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.547891 4749 scope.go:117] "RemoveContainer" containerID="5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a"
Oct 01 14:02:10 crc kubenswrapper[4749]: E1001 14:02:10.548273 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a\": container with ID starting with 5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a not found: ID does not exist" containerID="5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.548415 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a"} err="failed to get container status \"5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a\": rpc error: code = NotFound desc = could not find container \"5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a\": container with ID starting with 5c265fa8bbaa9d193c9c5412dd797ce8ea5ecece7c3f829eb3644f87b8805f2a not found: ID does not exist"
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.786531 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nfcw"]
Oct 01 14:02:10 crc kubenswrapper[4749]: I1001 14:02:10.802387 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8nfcw"]
Oct 01 14:02:11 crc kubenswrapper[4749]: I1001 14:02:11.249911 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" path="/var/lib/kubelet/pods/f6e2c9d3-03b5-43ca-beb3-609de31493de/volumes"
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.106285 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.107002 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.107075 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz"
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.108051 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ab6ef42cbe1670573b773f73a3eec3d9985fa70a7036319c387479a68495856"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.108174 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://5ab6ef42cbe1670573b773f73a3eec3d9985fa70a7036319c387479a68495856" gracePeriod=600
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.730568 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="5ab6ef42cbe1670573b773f73a3eec3d9985fa70a7036319c387479a68495856" exitCode=0
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.730670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"5ab6ef42cbe1670573b773f73a3eec3d9985fa70a7036319c387479a68495856"}
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.731155 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"}
Oct 01 14:02:32 crc kubenswrapper[4749]: I1001 14:02:32.731178 4749 scope.go:117] "RemoveContainer" containerID="1d9f57fbc5dcf4b4980c129cff2c7cc195c198e06ae2412f58212cbc9a8f1082"
Oct 01 14:04:32 crc kubenswrapper[4749]: I1001 14:04:32.107016 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:04:32 crc kubenswrapper[4749]: I1001 14:04:32.107610 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:05:02 crc kubenswrapper[4749]: I1001 14:05:02.106445 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:05:02 crc kubenswrapper[4749]: I1001 14:05:02.107068 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.106421 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.107010 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.107051 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz"
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.107781 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.107833 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" gracePeriod=600
Oct 01 14:05:32 crc kubenswrapper[4749]: E1001 14:05:32.225822 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.576421 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" exitCode=0
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.576477 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"}
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.576528 4749 scope.go:117] "RemoveContainer" containerID="5ab6ef42cbe1670573b773f73a3eec3d9985fa70a7036319c387479a68495856"
Oct 01 14:05:32 crc kubenswrapper[4749]: I1001 14:05:32.577193 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:05:32 crc kubenswrapper[4749]: E1001 14:05:32.577520 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:05:48 crc kubenswrapper[4749]: I1001 14:05:48.230354 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:05:48 crc kubenswrapper[4749]: E1001 14:05:48.231300 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:06:03 crc kubenswrapper[4749]: I1001 14:06:03.230808 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:06:03 crc kubenswrapper[4749]: E1001 14:06:03.231822 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:06:18 crc kubenswrapper[4749]: I1001 14:06:18.230342 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:06:18 crc kubenswrapper[4749]: E1001 14:06:18.231389 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:06:31 crc kubenswrapper[4749]: I1001 14:06:31.236920 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:06:31 crc kubenswrapper[4749]: E1001 14:06:31.237684 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:06:40 crc kubenswrapper[4749]: E1001 14:06:40.654743 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.220:41774->38.102.83.220:34693: write tcp 38.102.83.220:41774->38.102.83.220:34693: write: broken pipe
Oct 01 14:06:44 crc kubenswrapper[4749]: I1001 14:06:44.230910 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:06:44 crc kubenswrapper[4749]: E1001 14:06:44.231785 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:06:59 crc kubenswrapper[4749]: I1001 14:06:59.230549 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:06:59 crc kubenswrapper[4749]: E1001 14:06:59.231419 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:07:11 crc kubenswrapper[4749]: I1001 14:07:11.236042 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:07:11 crc kubenswrapper[4749]: E1001 14:07:11.236836 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:07:22 crc kubenswrapper[4749]: I1001 14:07:22.230484 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:07:22 crc kubenswrapper[4749]: E1001 14:07:22.232437 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:07:33 crc kubenswrapper[4749]: I1001 14:07:33.229512 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:07:33 crc kubenswrapper[4749]: E1001 14:07:33.230141 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:07:48 crc kubenswrapper[4749]: I1001 14:07:48.230315 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:07:48 crc kubenswrapper[4749]: E1001 14:07:48.230990 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:08:03 crc kubenswrapper[4749]: I1001 14:08:03.229605 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:08:03 crc kubenswrapper[4749]: E1001 14:08:03.230348 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:08:17 crc kubenswrapper[4749]: I1001 14:08:17.229689 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:08:17 crc kubenswrapper[4749]: E1001 14:08:17.230503 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:08:30 crc kubenswrapper[4749]: I1001 14:08:30.231086 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:08:30 crc kubenswrapper[4749]: E1001 14:08:30.232024 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:08:41 crc kubenswrapper[4749]: I1001 14:08:41.251266 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:08:41 crc kubenswrapper[4749]: E1001 14:08:41.252326 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.279197 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zlgt6"]
Oct 01 14:08:53 crc kubenswrapper[4749]: E1001 14:08:53.280203 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerName="registry-server"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.280237 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerName="registry-server"
Oct 01 14:08:53 crc kubenswrapper[4749]: E1001 14:08:53.280257 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerName="extract-content"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.280264 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerName="extract-content"
Oct 01 14:08:53 crc kubenswrapper[4749]: E1001 14:08:53.280283 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerName="extract-utilities"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.280289 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerName="extract-utilities"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.280491 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e2c9d3-03b5-43ca-beb3-609de31493de" containerName="registry-server"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.282061 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.288265 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zlgt6"]
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.306039 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-utilities\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.306169 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-catalog-content\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.306365 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqvfx\" (UniqueName: \"kubernetes.io/projected/20462654-ee50-4ee5-b117-fa1c16a048f1-kube-api-access-hqvfx\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.407823 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqvfx\" (UniqueName: \"kubernetes.io/projected/20462654-ee50-4ee5-b117-fa1c16a048f1-kube-api-access-hqvfx\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.407912 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-utilities\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.407998 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-catalog-content\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.408450 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-utilities\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.408586 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-catalog-content\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.431006 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqvfx\" (UniqueName: \"kubernetes.io/projected/20462654-ee50-4ee5-b117-fa1c16a048f1-kube-api-access-hqvfx\") pod \"community-operators-zlgt6\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.604763 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlgt6"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.869437 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-459p5"]
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.872781 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.878067 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-459p5"]
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.917709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47zrz\" (UniqueName: \"kubernetes.io/projected/548656b3-50ec-4c25-9cb8-067406a98efa-kube-api-access-47zrz\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.917803 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-utilities\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:53 crc kubenswrapper[4749]: I1001 14:08:53.918005 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-catalog-content\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.020180 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-utilities\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.020288 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-catalog-content\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.020410 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47zrz\" (UniqueName: \"kubernetes.io/projected/548656b3-50ec-4c25-9cb8-067406a98efa-kube-api-access-47zrz\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.020626 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-utilities\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.020753 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-catalog-content\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.136439 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47zrz\" (UniqueName: \"kubernetes.io/projected/548656b3-50ec-4c25-9cb8-067406a98efa-kube-api-access-47zrz\") pod \"redhat-marketplace-459p5\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.195829 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-459p5"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.230442 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda"
Oct 01 14:08:54 crc kubenswrapper[4749]: E1001 14:08:54.230773 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.697254 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zlgt6"]
Oct 01 14:08:54 crc kubenswrapper[4749]: I1001 14:08:54.709486 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-459p5"]
Oct 01 14:08:55 crc kubenswrapper[4749]: I1001 14:08:55.563309 4749 generic.go:334] "Generic (PLEG): container finished" podID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerID="0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8" exitCode=0
Oct 01 14:08:55 crc kubenswrapper[4749]: I1001 14:08:55.563598 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlgt6" event={"ID":"20462654-ee50-4ee5-b117-fa1c16a048f1","Type":"ContainerDied","Data":"0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8"}
Oct 01 14:08:55 crc kubenswrapper[4749]: I1001 14:08:55.563621 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlgt6" event={"ID":"20462654-ee50-4ee5-b117-fa1c16a048f1","Type":"ContainerStarted","Data":"e3a63cc4ee8e74b52a72d79d491f99c5b05fd00dedd5da165f4218d93f61d0ff"}
Oct 01 14:08:55 crc kubenswrapper[4749]: I1001 14:08:55.565792 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 01 14:08:55 crc kubenswrapper[4749]: I1001 14:08:55.566018 4749 generic.go:334] "Generic (PLEG): container finished" podID="548656b3-50ec-4c25-9cb8-067406a98efa" containerID="433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756" exitCode=0
Oct 01 14:08:55 crc kubenswrapper[4749]: I1001 14:08:55.566130 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-459p5" event={"ID":"548656b3-50ec-4c25-9cb8-067406a98efa","Type":"ContainerDied","Data":"433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756"}
Oct 01 14:08:55 crc kubenswrapper[4749]: I1001 14:08:55.566240 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-459p5" event={"ID":"548656b3-50ec-4c25-9cb8-067406a98efa","Type":"ContainerStarted","Data":"0d0292ca94fef16c47e655b67f44447c959e3da995853e40759ba24e11599f18"}
Oct 01 14:08:56 crc kubenswrapper[4749]: I1001 14:08:56.579613 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/community-operators-zlgt6" event={"ID":"20462654-ee50-4ee5-b117-fa1c16a048f1","Type":"ContainerStarted","Data":"00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff"} Oct 01 14:08:56 crc kubenswrapper[4749]: I1001 14:08:56.584598 4749 generic.go:334] "Generic (PLEG): container finished" podID="548656b3-50ec-4c25-9cb8-067406a98efa" containerID="eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3" exitCode=0 Oct 01 14:08:56 crc kubenswrapper[4749]: I1001 14:08:56.584678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-459p5" event={"ID":"548656b3-50ec-4c25-9cb8-067406a98efa","Type":"ContainerDied","Data":"eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3"} Oct 01 14:08:57 crc kubenswrapper[4749]: I1001 14:08:57.609582 4749 generic.go:334] "Generic (PLEG): container finished" podID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerID="00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff" exitCode=0 Oct 01 14:08:57 crc kubenswrapper[4749]: I1001 14:08:57.609863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlgt6" event={"ID":"20462654-ee50-4ee5-b117-fa1c16a048f1","Type":"ContainerDied","Data":"00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff"} Oct 01 14:08:58 crc kubenswrapper[4749]: I1001 14:08:58.623031 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-459p5" event={"ID":"548656b3-50ec-4c25-9cb8-067406a98efa","Type":"ContainerStarted","Data":"3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb"} Oct 01 14:08:58 crc kubenswrapper[4749]: I1001 14:08:58.626670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlgt6" 
event={"ID":"20462654-ee50-4ee5-b117-fa1c16a048f1","Type":"ContainerStarted","Data":"140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710"} Oct 01 14:08:58 crc kubenswrapper[4749]: I1001 14:08:58.651117 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-459p5" podStartSLOduration=3.865692198 podStartE2EDuration="5.65109745s" podCreationTimestamp="2025-10-01 14:08:53 +0000 UTC" firstStartedPulling="2025-10-01 14:08:55.567088202 +0000 UTC m=+3795.621073101" lastFinishedPulling="2025-10-01 14:08:57.352493454 +0000 UTC m=+3797.406478353" observedRunningTime="2025-10-01 14:08:58.647566179 +0000 UTC m=+3798.701551088" watchObservedRunningTime="2025-10-01 14:08:58.65109745 +0000 UTC m=+3798.705082349" Oct 01 14:08:58 crc kubenswrapper[4749]: I1001 14:08:58.669716 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zlgt6" podStartSLOduration=3.119296853 podStartE2EDuration="5.669695847s" podCreationTimestamp="2025-10-01 14:08:53 +0000 UTC" firstStartedPulling="2025-10-01 14:08:55.565538538 +0000 UTC m=+3795.619523437" lastFinishedPulling="2025-10-01 14:08:58.115937532 +0000 UTC m=+3798.169922431" observedRunningTime="2025-10-01 14:08:58.663953571 +0000 UTC m=+3798.717938490" watchObservedRunningTime="2025-10-01 14:08:58.669695847 +0000 UTC m=+3798.723680746" Oct 01 14:09:03 crc kubenswrapper[4749]: I1001 14:09:03.605261 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zlgt6" Oct 01 14:09:03 crc kubenswrapper[4749]: I1001 14:09:03.605847 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zlgt6" Oct 01 14:09:03 crc kubenswrapper[4749]: I1001 14:09:03.670994 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zlgt6" Oct 01 
14:09:03 crc kubenswrapper[4749]: I1001 14:09:03.740506 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zlgt6" Oct 01 14:09:03 crc kubenswrapper[4749]: I1001 14:09:03.908110 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zlgt6"] Oct 01 14:09:04 crc kubenswrapper[4749]: I1001 14:09:04.196021 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-459p5" Oct 01 14:09:04 crc kubenswrapper[4749]: I1001 14:09:04.196099 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-459p5" Oct 01 14:09:04 crc kubenswrapper[4749]: I1001 14:09:04.241053 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-459p5" Oct 01 14:09:04 crc kubenswrapper[4749]: I1001 14:09:04.740030 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-459p5" Oct 01 14:09:05 crc kubenswrapper[4749]: I1001 14:09:05.231625 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:09:05 crc kubenswrapper[4749]: E1001 14:09:05.232049 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:09:05 crc kubenswrapper[4749]: I1001 14:09:05.699721 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zlgt6" 
podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerName="registry-server" containerID="cri-o://140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710" gracePeriod=2 Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.186314 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlgt6" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.311914 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-459p5"] Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.374744 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-utilities\") pod \"20462654-ee50-4ee5-b117-fa1c16a048f1\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.374848 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqvfx\" (UniqueName: \"kubernetes.io/projected/20462654-ee50-4ee5-b117-fa1c16a048f1-kube-api-access-hqvfx\") pod \"20462654-ee50-4ee5-b117-fa1c16a048f1\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.374996 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-catalog-content\") pod \"20462654-ee50-4ee5-b117-fa1c16a048f1\" (UID: \"20462654-ee50-4ee5-b117-fa1c16a048f1\") " Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.375831 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-utilities" (OuterVolumeSpecName: "utilities") pod "20462654-ee50-4ee5-b117-fa1c16a048f1" (UID: "20462654-ee50-4ee5-b117-fa1c16a048f1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.376973 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.388336 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20462654-ee50-4ee5-b117-fa1c16a048f1-kube-api-access-hqvfx" (OuterVolumeSpecName: "kube-api-access-hqvfx") pod "20462654-ee50-4ee5-b117-fa1c16a048f1" (UID: "20462654-ee50-4ee5-b117-fa1c16a048f1"). InnerVolumeSpecName "kube-api-access-hqvfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.432345 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20462654-ee50-4ee5-b117-fa1c16a048f1" (UID: "20462654-ee50-4ee5-b117-fa1c16a048f1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.479110 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20462654-ee50-4ee5-b117-fa1c16a048f1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.479147 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqvfx\" (UniqueName: \"kubernetes.io/projected/20462654-ee50-4ee5-b117-fa1c16a048f1-kube-api-access-hqvfx\") on node \"crc\" DevicePath \"\"" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.720852 4749 generic.go:334] "Generic (PLEG): container finished" podID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerID="140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710" exitCode=0 Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.721171 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-459p5" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" containerName="registry-server" containerID="cri-o://3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb" gracePeriod=2 Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.721632 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zlgt6" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.721694 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlgt6" event={"ID":"20462654-ee50-4ee5-b117-fa1c16a048f1","Type":"ContainerDied","Data":"140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710"} Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.721797 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlgt6" event={"ID":"20462654-ee50-4ee5-b117-fa1c16a048f1","Type":"ContainerDied","Data":"e3a63cc4ee8e74b52a72d79d491f99c5b05fd00dedd5da165f4218d93f61d0ff"} Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.721859 4749 scope.go:117] "RemoveContainer" containerID="140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.778181 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zlgt6"] Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.791499 4749 scope.go:117] "RemoveContainer" containerID="00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.796458 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zlgt6"] Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.845268 4749 scope.go:117] "RemoveContainer" containerID="0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.934341 4749 scope.go:117] "RemoveContainer" containerID="140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710" Oct 01 14:09:06 crc kubenswrapper[4749]: E1001 14:09:06.934863 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710\": container with ID starting with 140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710 not found: ID does not exist" containerID="140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.934903 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710"} err="failed to get container status \"140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710\": rpc error: code = NotFound desc = could not find container \"140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710\": container with ID starting with 140c7a3f070f902ed621428913b7df16d0f41f6ae663ce28284380aa2af92710 not found: ID does not exist" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.934934 4749 scope.go:117] "RemoveContainer" containerID="00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff" Oct 01 14:09:06 crc kubenswrapper[4749]: E1001 14:09:06.935173 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff\": container with ID starting with 00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff not found: ID does not exist" containerID="00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.935200 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff"} err="failed to get container status \"00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff\": rpc error: code = NotFound desc = could not find container \"00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff\": container with ID 
starting with 00d5ffb657bdd0d6d35d0157f33b35dce3e59ac4e124566b8c98f09c8797cdff not found: ID does not exist" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.935240 4749 scope.go:117] "RemoveContainer" containerID="0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8" Oct 01 14:09:06 crc kubenswrapper[4749]: E1001 14:09:06.935511 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8\": container with ID starting with 0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8 not found: ID does not exist" containerID="0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8" Oct 01 14:09:06 crc kubenswrapper[4749]: I1001 14:09:06.935593 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8"} err="failed to get container status \"0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8\": rpc error: code = NotFound desc = could not find container \"0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8\": container with ID starting with 0e25c06c33f6d71aa9f530ee71a5f112a6616b5a2071e1e7dd109cbf1ee77df8 not found: ID does not exist" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.173644 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-459p5" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.195087 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47zrz\" (UniqueName: \"kubernetes.io/projected/548656b3-50ec-4c25-9cb8-067406a98efa-kube-api-access-47zrz\") pod \"548656b3-50ec-4c25-9cb8-067406a98efa\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.195731 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-utilities\") pod \"548656b3-50ec-4c25-9cb8-067406a98efa\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.195887 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-catalog-content\") pod \"548656b3-50ec-4c25-9cb8-067406a98efa\" (UID: \"548656b3-50ec-4c25-9cb8-067406a98efa\") " Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.196060 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-utilities" (OuterVolumeSpecName: "utilities") pod "548656b3-50ec-4c25-9cb8-067406a98efa" (UID: "548656b3-50ec-4c25-9cb8-067406a98efa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.196440 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.201357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/548656b3-50ec-4c25-9cb8-067406a98efa-kube-api-access-47zrz" (OuterVolumeSpecName: "kube-api-access-47zrz") pod "548656b3-50ec-4c25-9cb8-067406a98efa" (UID: "548656b3-50ec-4c25-9cb8-067406a98efa"). InnerVolumeSpecName "kube-api-access-47zrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.210947 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "548656b3-50ec-4c25-9cb8-067406a98efa" (UID: "548656b3-50ec-4c25-9cb8-067406a98efa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.241531 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" path="/var/lib/kubelet/pods/20462654-ee50-4ee5-b117-fa1c16a048f1/volumes" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.298853 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47zrz\" (UniqueName: \"kubernetes.io/projected/548656b3-50ec-4c25-9cb8-067406a98efa-kube-api-access-47zrz\") on node \"crc\" DevicePath \"\"" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.298891 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/548656b3-50ec-4c25-9cb8-067406a98efa-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.735240 4749 generic.go:334] "Generic (PLEG): container finished" podID="548656b3-50ec-4c25-9cb8-067406a98efa" containerID="3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb" exitCode=0 Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.735293 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-459p5" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.735308 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-459p5" event={"ID":"548656b3-50ec-4c25-9cb8-067406a98efa","Type":"ContainerDied","Data":"3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb"} Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.736102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-459p5" event={"ID":"548656b3-50ec-4c25-9cb8-067406a98efa","Type":"ContainerDied","Data":"0d0292ca94fef16c47e655b67f44447c959e3da995853e40759ba24e11599f18"} Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.736163 4749 scope.go:117] "RemoveContainer" containerID="3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.763333 4749 scope.go:117] "RemoveContainer" containerID="eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.766666 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-459p5"] Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.775391 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-459p5"] Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.790623 4749 scope.go:117] "RemoveContainer" containerID="433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.810873 4749 scope.go:117] "RemoveContainer" containerID="3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb" Oct 01 14:09:07 crc kubenswrapper[4749]: E1001 14:09:07.811398 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb\": container with ID starting with 3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb not found: ID does not exist" containerID="3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.811443 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb"} err="failed to get container status \"3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb\": rpc error: code = NotFound desc = could not find container \"3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb\": container with ID starting with 3498fba2291bfac654779c2258a3143ca28f3d6181275746ea29177f1ee314bb not found: ID does not exist" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.811476 4749 scope.go:117] "RemoveContainer" containerID="eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3" Oct 01 14:09:07 crc kubenswrapper[4749]: E1001 14:09:07.811844 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3\": container with ID starting with eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3 not found: ID does not exist" containerID="eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.811894 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3"} err="failed to get container status \"eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3\": rpc error: code = NotFound desc = could not find container \"eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3\": container with ID 
starting with eda47a142f6817d8fe96a727d13a03796b2fd67ac15a573e4ee3ec99e18717e3 not found: ID does not exist" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.811927 4749 scope.go:117] "RemoveContainer" containerID="433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756" Oct 01 14:09:07 crc kubenswrapper[4749]: E1001 14:09:07.812290 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756\": container with ID starting with 433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756 not found: ID does not exist" containerID="433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756" Oct 01 14:09:07 crc kubenswrapper[4749]: I1001 14:09:07.812336 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756"} err="failed to get container status \"433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756\": rpc error: code = NotFound desc = could not find container \"433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756\": container with ID starting with 433d556546dc5772c58126d3195a4a83d04d5073e86a2102e24c6638549f1756 not found: ID does not exist" Oct 01 14:09:09 crc kubenswrapper[4749]: I1001 14:09:09.245088 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" path="/var/lib/kubelet/pods/548656b3-50ec-4c25-9cb8-067406a98efa/volumes" Oct 01 14:09:19 crc kubenswrapper[4749]: I1001 14:09:19.230089 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:09:19 crc kubenswrapper[4749]: E1001 14:09:19.231729 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:09:30 crc kubenswrapper[4749]: I1001 14:09:30.230672 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:09:30 crc kubenswrapper[4749]: E1001 14:09:30.232426 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:09:43 crc kubenswrapper[4749]: I1001 14:09:43.230827 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:09:43 crc kubenswrapper[4749]: E1001 14:09:43.231668 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:09:55 crc kubenswrapper[4749]: I1001 14:09:55.229881 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:09:55 crc kubenswrapper[4749]: E1001 14:09:55.230713 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:10:09 crc kubenswrapper[4749]: I1001 14:10:09.235420 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:10:09 crc kubenswrapper[4749]: E1001 14:10:09.236596 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:10:22 crc kubenswrapper[4749]: I1001 14:10:22.231280 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:10:22 crc kubenswrapper[4749]: E1001 14:10:22.232192 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:10:33 crc kubenswrapper[4749]: I1001 14:10:33.230100 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:10:33 crc kubenswrapper[4749]: I1001 14:10:33.703166 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" 
event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"6c0ac914a040bc36ab565d8fab3c9b34d545ec6932e1a611cd03fabfac269dd7"} Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.207330 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2h584"] Oct 01 14:11:22 crc kubenswrapper[4749]: E1001 14:11:22.208671 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerName="registry-server" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.208687 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerName="registry-server" Oct 01 14:11:22 crc kubenswrapper[4749]: E1001 14:11:22.208722 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerName="extract-content" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.208732 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerName="extract-content" Oct 01 14:11:22 crc kubenswrapper[4749]: E1001 14:11:22.208763 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerName="extract-utilities" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.208772 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerName="extract-utilities" Oct 01 14:11:22 crc kubenswrapper[4749]: E1001 14:11:22.208793 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" containerName="registry-server" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.208803 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" containerName="registry-server" Oct 01 14:11:22 crc kubenswrapper[4749]: E1001 14:11:22.208843 4749 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" containerName="extract-utilities" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.208851 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" containerName="extract-utilities" Oct 01 14:11:22 crc kubenswrapper[4749]: E1001 14:11:22.208881 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" containerName="extract-content" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.208890 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" containerName="extract-content" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.209393 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="20462654-ee50-4ee5-b117-fa1c16a048f1" containerName="registry-server" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.209425 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="548656b3-50ec-4c25-9cb8-067406a98efa" containerName="registry-server" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.213308 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.242885 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2h584"] Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.243287 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c2rf\" (UniqueName: \"kubernetes.io/projected/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-kube-api-access-9c2rf\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.246851 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-catalog-content\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.246915 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-utilities\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.349343 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-catalog-content\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.349430 4749 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-utilities\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.349479 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c2rf\" (UniqueName: \"kubernetes.io/projected/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-kube-api-access-9c2rf\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.350702 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-utilities\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.350883 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-catalog-content\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.369662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c2rf\" (UniqueName: \"kubernetes.io/projected/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-kube-api-access-9c2rf\") pod \"redhat-operators-2h584\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:22 crc kubenswrapper[4749]: I1001 14:11:22.547970 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:23 crc kubenswrapper[4749]: I1001 14:11:23.145065 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2h584"] Oct 01 14:11:23 crc kubenswrapper[4749]: I1001 14:11:23.287121 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2h584" event={"ID":"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce","Type":"ContainerStarted","Data":"be928f8136acdf118e2302774c92769d0bc5d3bc37b5a640ef9788f411c6a14b"} Oct 01 14:11:24 crc kubenswrapper[4749]: I1001 14:11:24.298882 4749 generic.go:334] "Generic (PLEG): container finished" podID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerID="416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5" exitCode=0 Oct 01 14:11:24 crc kubenswrapper[4749]: I1001 14:11:24.299005 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2h584" event={"ID":"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce","Type":"ContainerDied","Data":"416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5"} Oct 01 14:11:25 crc kubenswrapper[4749]: I1001 14:11:25.316369 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2h584" event={"ID":"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce","Type":"ContainerStarted","Data":"52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd"} Oct 01 14:11:26 crc kubenswrapper[4749]: I1001 14:11:26.326885 4749 generic.go:334] "Generic (PLEG): container finished" podID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerID="52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd" exitCode=0 Oct 01 14:11:26 crc kubenswrapper[4749]: I1001 14:11:26.326969 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2h584" 
event={"ID":"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce","Type":"ContainerDied","Data":"52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd"} Oct 01 14:11:27 crc kubenswrapper[4749]: I1001 14:11:27.337389 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2h584" event={"ID":"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce","Type":"ContainerStarted","Data":"c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78"} Oct 01 14:11:27 crc kubenswrapper[4749]: I1001 14:11:27.374036 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2h584" podStartSLOduration=2.7625559060000002 podStartE2EDuration="5.374012581s" podCreationTimestamp="2025-10-01 14:11:22 +0000 UTC" firstStartedPulling="2025-10-01 14:11:24.301503694 +0000 UTC m=+3944.355488603" lastFinishedPulling="2025-10-01 14:11:26.912960379 +0000 UTC m=+3946.966945278" observedRunningTime="2025-10-01 14:11:27.36564463 +0000 UTC m=+3947.419629539" watchObservedRunningTime="2025-10-01 14:11:27.374012581 +0000 UTC m=+3947.427997480" Oct 01 14:11:32 crc kubenswrapper[4749]: I1001 14:11:32.548330 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:32 crc kubenswrapper[4749]: I1001 14:11:32.549521 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:32 crc kubenswrapper[4749]: I1001 14:11:32.603863 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:33 crc kubenswrapper[4749]: I1001 14:11:33.456843 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:33 crc kubenswrapper[4749]: I1001 14:11:33.507475 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-2h584"] Oct 01 14:11:35 crc kubenswrapper[4749]: I1001 14:11:35.425747 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2h584" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerName="registry-server" containerID="cri-o://c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78" gracePeriod=2 Oct 01 14:11:35 crc kubenswrapper[4749]: E1001 14:11:35.756646 4749 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae0b81f7_e13f_48c8_be19_fdd8b2a34cce.slice/crio-c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae0b81f7_e13f_48c8_be19_fdd8b2a34cce.slice/crio-conmon-c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78.scope\": RecentStats: unable to find data in memory cache]" Oct 01 14:11:35 crc kubenswrapper[4749]: I1001 14:11:35.972184 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.117247 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-catalog-content\") pod \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.117642 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-utilities\") pod \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.117832 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c2rf\" (UniqueName: \"kubernetes.io/projected/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-kube-api-access-9c2rf\") pod \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\" (UID: \"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce\") " Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.118346 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-utilities" (OuterVolumeSpecName: "utilities") pod "ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" (UID: "ae0b81f7-e13f-48c8-be19-fdd8b2a34cce"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.118621 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.123168 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-kube-api-access-9c2rf" (OuterVolumeSpecName: "kube-api-access-9c2rf") pod "ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" (UID: "ae0b81f7-e13f-48c8-be19-fdd8b2a34cce"). InnerVolumeSpecName "kube-api-access-9c2rf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.205979 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" (UID: "ae0b81f7-e13f-48c8-be19-fdd8b2a34cce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.220337 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c2rf\" (UniqueName: \"kubernetes.io/projected/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-kube-api-access-9c2rf\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.220366 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.438636 4749 generic.go:334] "Generic (PLEG): container finished" podID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerID="c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78" exitCode=0 Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.438703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2h584" event={"ID":"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce","Type":"ContainerDied","Data":"c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78"} Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.438935 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2h584" event={"ID":"ae0b81f7-e13f-48c8-be19-fdd8b2a34cce","Type":"ContainerDied","Data":"be928f8136acdf118e2302774c92769d0bc5d3bc37b5a640ef9788f411c6a14b"} Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.438960 4749 scope.go:117] "RemoveContainer" containerID="c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.438790 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2h584" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.467538 4749 scope.go:117] "RemoveContainer" containerID="52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.472153 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2h584"] Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.483508 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2h584"] Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.498669 4749 scope.go:117] "RemoveContainer" containerID="416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.537703 4749 scope.go:117] "RemoveContainer" containerID="c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78" Oct 01 14:11:36 crc kubenswrapper[4749]: E1001 14:11:36.538137 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78\": container with ID starting with c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78 not found: ID does not exist" containerID="c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.538168 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78"} err="failed to get container status \"c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78\": rpc error: code = NotFound desc = could not find container \"c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78\": container with ID starting with c3c248ee5a00ebecb26ae163a8f5a1d610e3094789a48f7f7d2f7c8452684b78 not found: ID does 
not exist" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.538189 4749 scope.go:117] "RemoveContainer" containerID="52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd" Oct 01 14:11:36 crc kubenswrapper[4749]: E1001 14:11:36.538504 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd\": container with ID starting with 52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd not found: ID does not exist" containerID="52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.538524 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd"} err="failed to get container status \"52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd\": rpc error: code = NotFound desc = could not find container \"52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd\": container with ID starting with 52303ce083b0d4858b4855c0e9004b6f5457532fd6482608b93cc33bc43ec5cd not found: ID does not exist" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.538536 4749 scope.go:117] "RemoveContainer" containerID="416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5" Oct 01 14:11:36 crc kubenswrapper[4749]: E1001 14:11:36.539107 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5\": container with ID starting with 416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5 not found: ID does not exist" containerID="416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5" Oct 01 14:11:36 crc kubenswrapper[4749]: I1001 14:11:36.539162 4749 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5"} err="failed to get container status \"416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5\": rpc error: code = NotFound desc = could not find container \"416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5\": container with ID starting with 416d149154cef455b3f4c638563fa04a1447ae69600b4bed4bb7e5a2a1c90ca5 not found: ID does not exist" Oct 01 14:11:37 crc kubenswrapper[4749]: I1001 14:11:37.242086 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" path="/var/lib/kubelet/pods/ae0b81f7-e13f-48c8-be19-fdd8b2a34cce/volumes" Oct 01 14:13:02 crc kubenswrapper[4749]: I1001 14:13:02.106079 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:13:02 crc kubenswrapper[4749]: I1001 14:13:02.106751 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:13:32 crc kubenswrapper[4749]: I1001 14:13:32.106962 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:13:32 crc kubenswrapper[4749]: I1001 14:13:32.107489 4749 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.106788 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.107329 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.107371 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.108050 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c0ac914a040bc36ab565d8fab3c9b34d545ec6932e1a611cd03fabfac269dd7"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.108098 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" 
containerID="cri-o://6c0ac914a040bc36ab565d8fab3c9b34d545ec6932e1a611cd03fabfac269dd7" gracePeriod=600 Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.969869 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="6c0ac914a040bc36ab565d8fab3c9b34d545ec6932e1a611cd03fabfac269dd7" exitCode=0 Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.969917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"6c0ac914a040bc36ab565d8fab3c9b34d545ec6932e1a611cd03fabfac269dd7"} Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.970839 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404"} Oct 01 14:14:02 crc kubenswrapper[4749]: I1001 14:14:02.970924 4749 scope.go:117] "RemoveContainer" containerID="59d32d882754a9cec654e2bb49d996d5ea6dbaad211d05eb2aab4a6662a4cdda" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.670262 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-n9s56"] Oct 01 14:14:55 crc kubenswrapper[4749]: E1001 14:14:55.677812 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerName="extract-content" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.678152 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerName="extract-content" Oct 01 14:14:55 crc kubenswrapper[4749]: E1001 14:14:55.678191 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerName="extract-utilities" Oct 01 
14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.678201 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerName="extract-utilities" Oct 01 14:14:55 crc kubenswrapper[4749]: E1001 14:14:55.678268 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerName="registry-server" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.678279 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerName="registry-server" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.678552 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae0b81f7-e13f-48c8-be19-fdd8b2a34cce" containerName="registry-server" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.680432 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.691025 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n9s56"] Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.795990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kh6g\" (UniqueName: \"kubernetes.io/projected/d20a67e6-1bc9-4c6a-8950-550156f0352e-kube-api-access-4kh6g\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.796061 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-catalog-content\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" 
Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.796095 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-utilities\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.897811 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-catalog-content\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.897885 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-utilities\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.898113 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kh6g\" (UniqueName: \"kubernetes.io/projected/d20a67e6-1bc9-4c6a-8950-550156f0352e-kube-api-access-4kh6g\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.898317 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-catalog-content\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:55 
crc kubenswrapper[4749]: I1001 14:14:55.898493 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-utilities\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:55 crc kubenswrapper[4749]: I1001 14:14:55.923617 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kh6g\" (UniqueName: \"kubernetes.io/projected/d20a67e6-1bc9-4c6a-8950-550156f0352e-kube-api-access-4kh6g\") pod \"certified-operators-n9s56\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:56 crc kubenswrapper[4749]: I1001 14:14:56.011805 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:14:56 crc kubenswrapper[4749]: I1001 14:14:56.564327 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-n9s56"] Oct 01 14:14:57 crc kubenswrapper[4749]: I1001 14:14:57.515740 4749 generic.go:334] "Generic (PLEG): container finished" podID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerID="252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6" exitCode=0 Oct 01 14:14:57 crc kubenswrapper[4749]: I1001 14:14:57.515800 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9s56" event={"ID":"d20a67e6-1bc9-4c6a-8950-550156f0352e","Type":"ContainerDied","Data":"252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6"} Oct 01 14:14:57 crc kubenswrapper[4749]: I1001 14:14:57.516027 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9s56" 
event={"ID":"d20a67e6-1bc9-4c6a-8950-550156f0352e","Type":"ContainerStarted","Data":"5f57fde46adf698e83e87d8f4cff28702b1ca33c7cdb0fb4d83910250e181078"} Oct 01 14:14:57 crc kubenswrapper[4749]: I1001 14:14:57.518753 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:14:59 crc kubenswrapper[4749]: I1001 14:14:59.533279 4749 generic.go:334] "Generic (PLEG): container finished" podID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerID="9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d" exitCode=0 Oct 01 14:14:59 crc kubenswrapper[4749]: I1001 14:14:59.533364 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9s56" event={"ID":"d20a67e6-1bc9-4c6a-8950-550156f0352e","Type":"ContainerDied","Data":"9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d"} Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.154904 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt"] Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.167022 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt"] Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.167126 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.176449 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.176508 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.209758 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp62z\" (UniqueName: \"kubernetes.io/projected/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-kube-api-access-lp62z\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.209836 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-secret-volume\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.209985 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-config-volume\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.311829 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lp62z\" (UniqueName: \"kubernetes.io/projected/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-kube-api-access-lp62z\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.311893 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-secret-volume\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.311995 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-config-volume\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.313374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-config-volume\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.318525 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-secret-volume\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 
14:15:00.331020 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp62z\" (UniqueName: \"kubernetes.io/projected/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-kube-api-access-lp62z\") pod \"collect-profiles-29322135-sbfgt\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.531310 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.551114 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9s56" event={"ID":"d20a67e6-1bc9-4c6a-8950-550156f0352e","Type":"ContainerStarted","Data":"bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc"} Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.570876 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-n9s56" podStartSLOduration=3.072244205 podStartE2EDuration="5.5708542s" podCreationTimestamp="2025-10-01 14:14:55 +0000 UTC" firstStartedPulling="2025-10-01 14:14:57.518177697 +0000 UTC m=+4157.572162616" lastFinishedPulling="2025-10-01 14:15:00.016787712 +0000 UTC m=+4160.070772611" observedRunningTime="2025-10-01 14:15:00.569933144 +0000 UTC m=+4160.623918043" watchObservedRunningTime="2025-10-01 14:15:00.5708542 +0000 UTC m=+4160.624839109" Oct 01 14:15:00 crc kubenswrapper[4749]: W1001 14:15:00.973461 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29a870d8_d918_4699_9d4a_ce28b9b3d8fc.slice/crio-0613729e4e445fa5668d2b50fe1b486c7ec6c04c2e2e17b574f35c0b21b026e2 WatchSource:0}: Error finding container 0613729e4e445fa5668d2b50fe1b486c7ec6c04c2e2e17b574f35c0b21b026e2: Status 404 returned error can't 
find the container with id 0613729e4e445fa5668d2b50fe1b486c7ec6c04c2e2e17b574f35c0b21b026e2 Oct 01 14:15:00 crc kubenswrapper[4749]: I1001 14:15:00.973570 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt"] Oct 01 14:15:01 crc kubenswrapper[4749]: I1001 14:15:01.562131 4749 generic.go:334] "Generic (PLEG): container finished" podID="29a870d8-d918-4699-9d4a-ce28b9b3d8fc" containerID="353a6267411ff830fb4139639016d0f4f29abc52d9082cc920721d6355a97d77" exitCode=0 Oct 01 14:15:01 crc kubenswrapper[4749]: I1001 14:15:01.562211 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" event={"ID":"29a870d8-d918-4699-9d4a-ce28b9b3d8fc","Type":"ContainerDied","Data":"353a6267411ff830fb4139639016d0f4f29abc52d9082cc920721d6355a97d77"} Oct 01 14:15:01 crc kubenswrapper[4749]: I1001 14:15:01.563414 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" event={"ID":"29a870d8-d918-4699-9d4a-ce28b9b3d8fc","Type":"ContainerStarted","Data":"0613729e4e445fa5668d2b50fe1b486c7ec6c04c2e2e17b574f35c0b21b026e2"} Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.585980 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" event={"ID":"29a870d8-d918-4699-9d4a-ce28b9b3d8fc","Type":"ContainerDied","Data":"0613729e4e445fa5668d2b50fe1b486c7ec6c04c2e2e17b574f35c0b21b026e2"} Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.586492 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0613729e4e445fa5668d2b50fe1b486c7ec6c04c2e2e17b574f35c0b21b026e2" Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.837908 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.986861 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-secret-volume\") pod \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.986951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-config-volume\") pod \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.987199 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp62z\" (UniqueName: \"kubernetes.io/projected/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-kube-api-access-lp62z\") pod \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\" (UID: \"29a870d8-d918-4699-9d4a-ce28b9b3d8fc\") " Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.988470 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-config-volume" (OuterVolumeSpecName: "config-volume") pod "29a870d8-d918-4699-9d4a-ce28b9b3d8fc" (UID: "29a870d8-d918-4699-9d4a-ce28b9b3d8fc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.995609 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-kube-api-access-lp62z" (OuterVolumeSpecName: "kube-api-access-lp62z") pod "29a870d8-d918-4699-9d4a-ce28b9b3d8fc" (UID: "29a870d8-d918-4699-9d4a-ce28b9b3d8fc"). 
InnerVolumeSpecName "kube-api-access-lp62z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:15:03 crc kubenswrapper[4749]: I1001 14:15:03.995688 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "29a870d8-d918-4699-9d4a-ce28b9b3d8fc" (UID: "29a870d8-d918-4699-9d4a-ce28b9b3d8fc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:15:04 crc kubenswrapper[4749]: I1001 14:15:04.090549 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:04 crc kubenswrapper[4749]: I1001 14:15:04.090714 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp62z\" (UniqueName: \"kubernetes.io/projected/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-kube-api-access-lp62z\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:04 crc kubenswrapper[4749]: I1001 14:15:04.090746 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/29a870d8-d918-4699-9d4a-ce28b9b3d8fc-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:04 crc kubenswrapper[4749]: I1001 14:15:04.597415 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322135-sbfgt" Oct 01 14:15:04 crc kubenswrapper[4749]: I1001 14:15:04.937985 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn"] Oct 01 14:15:04 crc kubenswrapper[4749]: I1001 14:15:04.949085 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322090-tbrsn"] Oct 01 14:15:05 crc kubenswrapper[4749]: I1001 14:15:05.240802 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f6bfb71-014e-4eca-8a3f-5e93745f039f" path="/var/lib/kubelet/pods/5f6bfb71-014e-4eca-8a3f-5e93745f039f/volumes" Oct 01 14:15:06 crc kubenswrapper[4749]: I1001 14:15:06.011929 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:15:06 crc kubenswrapper[4749]: I1001 14:15:06.011987 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:15:06 crc kubenswrapper[4749]: I1001 14:15:06.065754 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:15:06 crc kubenswrapper[4749]: I1001 14:15:06.668797 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:15:06 crc kubenswrapper[4749]: I1001 14:15:06.714826 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n9s56"] Oct 01 14:15:08 crc kubenswrapper[4749]: I1001 14:15:08.634666 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-n9s56" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerName="registry-server" 
containerID="cri-o://bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc" gracePeriod=2 Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.095941 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.208689 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kh6g\" (UniqueName: \"kubernetes.io/projected/d20a67e6-1bc9-4c6a-8950-550156f0352e-kube-api-access-4kh6g\") pod \"d20a67e6-1bc9-4c6a-8950-550156f0352e\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.208755 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-utilities\") pod \"d20a67e6-1bc9-4c6a-8950-550156f0352e\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.208778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-catalog-content\") pod \"d20a67e6-1bc9-4c6a-8950-550156f0352e\" (UID: \"d20a67e6-1bc9-4c6a-8950-550156f0352e\") " Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.210082 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-utilities" (OuterVolumeSpecName: "utilities") pod "d20a67e6-1bc9-4c6a-8950-550156f0352e" (UID: "d20a67e6-1bc9-4c6a-8950-550156f0352e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.214065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d20a67e6-1bc9-4c6a-8950-550156f0352e-kube-api-access-4kh6g" (OuterVolumeSpecName: "kube-api-access-4kh6g") pod "d20a67e6-1bc9-4c6a-8950-550156f0352e" (UID: "d20a67e6-1bc9-4c6a-8950-550156f0352e"). InnerVolumeSpecName "kube-api-access-4kh6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.259876 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d20a67e6-1bc9-4c6a-8950-550156f0352e" (UID: "d20a67e6-1bc9-4c6a-8950-550156f0352e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.314558 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kh6g\" (UniqueName: \"kubernetes.io/projected/d20a67e6-1bc9-4c6a-8950-550156f0352e-kube-api-access-4kh6g\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.314971 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.314995 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d20a67e6-1bc9-4c6a-8950-550156f0352e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.648105 4749 generic.go:334] "Generic (PLEG): container finished" podID="d20a67e6-1bc9-4c6a-8950-550156f0352e" 
containerID="bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc" exitCode=0 Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.648208 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9s56" event={"ID":"d20a67e6-1bc9-4c6a-8950-550156f0352e","Type":"ContainerDied","Data":"bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc"} Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.648472 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-n9s56" event={"ID":"d20a67e6-1bc9-4c6a-8950-550156f0352e","Type":"ContainerDied","Data":"5f57fde46adf698e83e87d8f4cff28702b1ca33c7cdb0fb4d83910250e181078"} Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.648496 4749 scope.go:117] "RemoveContainer" containerID="bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.648285 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-n9s56" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.688194 4749 scope.go:117] "RemoveContainer" containerID="9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.709353 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-n9s56"] Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.718535 4749 scope.go:117] "RemoveContainer" containerID="252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.720941 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-n9s56"] Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.771085 4749 scope.go:117] "RemoveContainer" containerID="bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc" Oct 01 14:15:09 crc kubenswrapper[4749]: E1001 14:15:09.771549 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc\": container with ID starting with bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc not found: ID does not exist" containerID="bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.771583 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc"} err="failed to get container status \"bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc\": rpc error: code = NotFound desc = could not find container \"bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc\": container with ID starting with bbb1ecbe3523cf2c6e32c48242d5521d31a9fd8147cda6dd6f30358a26c47cbc not 
found: ID does not exist" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.771608 4749 scope.go:117] "RemoveContainer" containerID="9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d" Oct 01 14:15:09 crc kubenswrapper[4749]: E1001 14:15:09.771921 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d\": container with ID starting with 9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d not found: ID does not exist" containerID="9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.771957 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d"} err="failed to get container status \"9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d\": rpc error: code = NotFound desc = could not find container \"9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d\": container with ID starting with 9e5cd1b7aa9cff90a909ce958ef3044a047fc4eebc95a60c39a0ff1291148f5d not found: ID does not exist" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.771978 4749 scope.go:117] "RemoveContainer" containerID="252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6" Oct 01 14:15:09 crc kubenswrapper[4749]: E1001 14:15:09.772332 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6\": container with ID starting with 252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6 not found: ID does not exist" containerID="252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6" Oct 01 14:15:09 crc kubenswrapper[4749]: I1001 14:15:09.772370 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6"} err="failed to get container status \"252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6\": rpc error: code = NotFound desc = could not find container \"252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6\": container with ID starting with 252fa007a0c4b1d705162dc8a429751a82f534d5b0e76a81c017ff823a9b8ba6 not found: ID does not exist" Oct 01 14:15:10 crc kubenswrapper[4749]: I1001 14:15:10.598882 4749 scope.go:117] "RemoveContainer" containerID="23ea4160487656a8788c18b91e0711534428f68278dd4495e03ad94b070dc433" Oct 01 14:15:11 crc kubenswrapper[4749]: I1001 14:15:11.254155 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" path="/var/lib/kubelet/pods/d20a67e6-1bc9-4c6a-8950-550156f0352e/volumes" Oct 01 14:16:02 crc kubenswrapper[4749]: I1001 14:16:02.106035 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:16:02 crc kubenswrapper[4749]: I1001 14:16:02.106658 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:16:32 crc kubenswrapper[4749]: I1001 14:16:32.106165 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:16:32 crc kubenswrapper[4749]: I1001 14:16:32.106808 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.106194 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.106737 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.106787 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.107691 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.107774 4749 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" gracePeriod=600 Oct 01 14:17:02 crc kubenswrapper[4749]: E1001 14:17:02.248859 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.767826 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" exitCode=0 Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.767873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404"} Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.767947 4749 scope.go:117] "RemoveContainer" containerID="6c0ac914a040bc36ab565d8fab3c9b34d545ec6932e1a611cd03fabfac269dd7" Oct 01 14:17:02 crc kubenswrapper[4749]: I1001 14:17:02.768570 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:17:02 crc kubenswrapper[4749]: E1001 14:17:02.768829 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:17:14 crc kubenswrapper[4749]: I1001 14:17:14.229691 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:17:14 crc kubenswrapper[4749]: E1001 14:17:14.230509 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:17:28 crc kubenswrapper[4749]: I1001 14:17:28.230310 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:17:28 crc kubenswrapper[4749]: E1001 14:17:28.231121 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:17:43 crc kubenswrapper[4749]: I1001 14:17:43.230308 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:17:43 crc kubenswrapper[4749]: E1001 14:17:43.231149 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:17:54 crc kubenswrapper[4749]: I1001 14:17:54.230307 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:17:54 crc kubenswrapper[4749]: E1001 14:17:54.231067 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:18:08 crc kubenswrapper[4749]: I1001 14:18:08.231670 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:18:08 crc kubenswrapper[4749]: E1001 14:18:08.232521 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:18:19 crc kubenswrapper[4749]: I1001 14:18:19.230668 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:18:19 crc kubenswrapper[4749]: E1001 14:18:19.231438 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:18:32 crc kubenswrapper[4749]: I1001 14:18:32.230786 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:18:32 crc kubenswrapper[4749]: E1001 14:18:32.232862 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:18:46 crc kubenswrapper[4749]: I1001 14:18:46.229973 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:18:46 crc kubenswrapper[4749]: E1001 14:18:46.230725 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:19:01 crc kubenswrapper[4749]: I1001 14:19:01.236059 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:19:01 crc kubenswrapper[4749]: E1001 14:19:01.236854 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:19:13 crc kubenswrapper[4749]: I1001 14:19:13.230679 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:19:13 crc kubenswrapper[4749]: E1001 14:19:13.231587 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:19:25 crc kubenswrapper[4749]: I1001 14:19:25.230633 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:19:25 crc kubenswrapper[4749]: E1001 14:19:25.231977 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:19:37 crc kubenswrapper[4749]: I1001 14:19:37.230498 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:19:37 crc kubenswrapper[4749]: E1001 14:19:37.231763 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:19:50 crc kubenswrapper[4749]: I1001 14:19:50.230361 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:19:50 crc kubenswrapper[4749]: E1001 14:19:50.231102 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:20:03 crc kubenswrapper[4749]: I1001 14:20:03.231116 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:20:03 crc kubenswrapper[4749]: E1001 14:20:03.231878 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:20:17 crc kubenswrapper[4749]: I1001 14:20:17.230177 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:20:17 crc kubenswrapper[4749]: E1001 14:20:17.230939 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.465921 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w95dc"] Oct 01 14:20:22 crc kubenswrapper[4749]: E1001 14:20:22.467165 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerName="registry-server" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.467186 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerName="registry-server" Oct 01 14:20:22 crc kubenswrapper[4749]: E1001 14:20:22.467325 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a870d8-d918-4699-9d4a-ce28b9b3d8fc" containerName="collect-profiles" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.467339 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a870d8-d918-4699-9d4a-ce28b9b3d8fc" containerName="collect-profiles" Oct 01 14:20:22 crc kubenswrapper[4749]: E1001 14:20:22.467380 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerName="extract-utilities" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.467393 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerName="extract-utilities" Oct 01 14:20:22 crc kubenswrapper[4749]: E1001 14:20:22.467451 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerName="extract-content" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.467467 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerName="extract-content" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.467851 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d20a67e6-1bc9-4c6a-8950-550156f0352e" containerName="registry-server" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.467907 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a870d8-d918-4699-9d4a-ce28b9b3d8fc" containerName="collect-profiles" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.470448 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.477158 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w95dc"] Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.528749 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-utilities\") pod \"community-operators-w95dc\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.529112 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwvjs\" (UniqueName: \"kubernetes.io/projected/6a08f06b-9a80-43ee-9fef-0e722720ce8e-kube-api-access-jwvjs\") pod \"community-operators-w95dc\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.529532 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-catalog-content\") pod 
\"community-operators-w95dc\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.631970 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwvjs\" (UniqueName: \"kubernetes.io/projected/6a08f06b-9a80-43ee-9fef-0e722720ce8e-kube-api-access-jwvjs\") pod \"community-operators-w95dc\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.632091 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-catalog-content\") pod \"community-operators-w95dc\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.632118 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-utilities\") pod \"community-operators-w95dc\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.632662 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-utilities\") pod \"community-operators-w95dc\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.633142 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-catalog-content\") pod \"community-operators-w95dc\" (UID: 
\"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.668038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwvjs\" (UniqueName: \"kubernetes.io/projected/6a08f06b-9a80-43ee-9fef-0e722720ce8e-kube-api-access-jwvjs\") pod \"community-operators-w95dc\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:22 crc kubenswrapper[4749]: I1001 14:20:22.805514 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:23 crc kubenswrapper[4749]: I1001 14:20:23.410663 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w95dc"] Oct 01 14:20:23 crc kubenswrapper[4749]: I1001 14:20:23.893339 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerID="1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177" exitCode=0 Oct 01 14:20:23 crc kubenswrapper[4749]: I1001 14:20:23.893449 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w95dc" event={"ID":"6a08f06b-9a80-43ee-9fef-0e722720ce8e","Type":"ContainerDied","Data":"1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177"} Oct 01 14:20:23 crc kubenswrapper[4749]: I1001 14:20:23.893631 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w95dc" event={"ID":"6a08f06b-9a80-43ee-9fef-0e722720ce8e","Type":"ContainerStarted","Data":"175cb610c5024484af679d8953311520d1e319577b991e29f8255cbf8c8401d8"} Oct 01 14:20:23 crc kubenswrapper[4749]: I1001 14:20:23.895262 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:20:24 crc kubenswrapper[4749]: I1001 14:20:24.906669 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w95dc" event={"ID":"6a08f06b-9a80-43ee-9fef-0e722720ce8e","Type":"ContainerStarted","Data":"bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015"} Oct 01 14:20:25 crc kubenswrapper[4749]: I1001 14:20:25.919985 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerID="bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015" exitCode=0 Oct 01 14:20:25 crc kubenswrapper[4749]: I1001 14:20:25.920141 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w95dc" event={"ID":"6a08f06b-9a80-43ee-9fef-0e722720ce8e","Type":"ContainerDied","Data":"bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015"} Oct 01 14:20:26 crc kubenswrapper[4749]: I1001 14:20:26.943171 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w95dc" event={"ID":"6a08f06b-9a80-43ee-9fef-0e722720ce8e","Type":"ContainerStarted","Data":"c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae"} Oct 01 14:20:26 crc kubenswrapper[4749]: I1001 14:20:26.962909 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w95dc" podStartSLOduration=2.272648835 podStartE2EDuration="4.962890991s" podCreationTimestamp="2025-10-01 14:20:22 +0000 UTC" firstStartedPulling="2025-10-01 14:20:23.895009648 +0000 UTC m=+4483.948994547" lastFinishedPulling="2025-10-01 14:20:26.585251804 +0000 UTC m=+4486.639236703" observedRunningTime="2025-10-01 14:20:26.962562572 +0000 UTC m=+4487.016547511" watchObservedRunningTime="2025-10-01 14:20:26.962890991 +0000 UTC m=+4487.016875890" Oct 01 14:20:28 crc kubenswrapper[4749]: I1001 14:20:28.230786 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:20:28 crc 
kubenswrapper[4749]: E1001 14:20:28.231320 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:20:32 crc kubenswrapper[4749]: I1001 14:20:32.806689 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:32 crc kubenswrapper[4749]: I1001 14:20:32.807400 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:32 crc kubenswrapper[4749]: I1001 14:20:32.874324 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:33 crc kubenswrapper[4749]: I1001 14:20:33.042688 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:33 crc kubenswrapper[4749]: I1001 14:20:33.107587 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w95dc"] Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.016508 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w95dc" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerName="registry-server" containerID="cri-o://c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae" gracePeriod=2 Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.501877 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.631378 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwvjs\" (UniqueName: \"kubernetes.io/projected/6a08f06b-9a80-43ee-9fef-0e722720ce8e-kube-api-access-jwvjs\") pod \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.631483 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-utilities\") pod \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.631538 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-catalog-content\") pod \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\" (UID: \"6a08f06b-9a80-43ee-9fef-0e722720ce8e\") " Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.632923 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-utilities" (OuterVolumeSpecName: "utilities") pod "6a08f06b-9a80-43ee-9fef-0e722720ce8e" (UID: "6a08f06b-9a80-43ee-9fef-0e722720ce8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.637893 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a08f06b-9a80-43ee-9fef-0e722720ce8e-kube-api-access-jwvjs" (OuterVolumeSpecName: "kube-api-access-jwvjs") pod "6a08f06b-9a80-43ee-9fef-0e722720ce8e" (UID: "6a08f06b-9a80-43ee-9fef-0e722720ce8e"). InnerVolumeSpecName "kube-api-access-jwvjs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.687567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a08f06b-9a80-43ee-9fef-0e722720ce8e" (UID: "6a08f06b-9a80-43ee-9fef-0e722720ce8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.733762 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwvjs\" (UniqueName: \"kubernetes.io/projected/6a08f06b-9a80-43ee-9fef-0e722720ce8e-kube-api-access-jwvjs\") on node \"crc\" DevicePath \"\"" Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.733810 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:20:35 crc kubenswrapper[4749]: I1001 14:20:35.733822 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a08f06b-9a80-43ee-9fef-0e722720ce8e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.029600 4749 generic.go:334] "Generic (PLEG): container finished" podID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerID="c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae" exitCode=0 Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.029649 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w95dc" event={"ID":"6a08f06b-9a80-43ee-9fef-0e722720ce8e","Type":"ContainerDied","Data":"c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae"} Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.029679 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-w95dc" event={"ID":"6a08f06b-9a80-43ee-9fef-0e722720ce8e","Type":"ContainerDied","Data":"175cb610c5024484af679d8953311520d1e319577b991e29f8255cbf8c8401d8"} Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.029702 4749 scope.go:117] "RemoveContainer" containerID="c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.029733 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w95dc" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.063922 4749 scope.go:117] "RemoveContainer" containerID="bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.066170 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w95dc"] Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.079664 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w95dc"] Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.096553 4749 scope.go:117] "RemoveContainer" containerID="1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.135531 4749 scope.go:117] "RemoveContainer" containerID="c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae" Oct 01 14:20:36 crc kubenswrapper[4749]: E1001 14:20:36.135911 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae\": container with ID starting with c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae not found: ID does not exist" containerID="c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 
14:20:36.135995 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae"} err="failed to get container status \"c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae\": rpc error: code = NotFound desc = could not find container \"c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae\": container with ID starting with c3b27bf3bb0acb0e354267abf7a4b1879a5ac11d3a8ef5a8a6d4662ccca8eeae not found: ID does not exist" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.136050 4749 scope.go:117] "RemoveContainer" containerID="bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015" Oct 01 14:20:36 crc kubenswrapper[4749]: E1001 14:20:36.136465 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015\": container with ID starting with bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015 not found: ID does not exist" containerID="bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.136490 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015"} err="failed to get container status \"bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015\": rpc error: code = NotFound desc = could not find container \"bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015\": container with ID starting with bc283d9d1973036f129b8229496a13bc5924ce28fa06c8a8ccb6123de429c015 not found: ID does not exist" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.136509 4749 scope.go:117] "RemoveContainer" containerID="1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177" Oct 01 14:20:36 crc 
kubenswrapper[4749]: E1001 14:20:36.136786 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177\": container with ID starting with 1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177 not found: ID does not exist" containerID="1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177" Oct 01 14:20:36 crc kubenswrapper[4749]: I1001 14:20:36.136821 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177"} err="failed to get container status \"1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177\": rpc error: code = NotFound desc = could not find container \"1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177\": container with ID starting with 1080855e50d8a21c0080acc93ce35392eac7665bd715045b0770a08cb59c4177 not found: ID does not exist" Oct 01 14:20:37 crc kubenswrapper[4749]: I1001 14:20:37.244600 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" path="/var/lib/kubelet/pods/6a08f06b-9a80-43ee-9fef-0e722720ce8e/volumes" Oct 01 14:20:41 crc kubenswrapper[4749]: I1001 14:20:41.240589 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:20:41 crc kubenswrapper[4749]: E1001 14:20:41.241451 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:20:52 crc 
kubenswrapper[4749]: I1001 14:20:52.230420 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:20:52 crc kubenswrapper[4749]: E1001 14:20:52.232390 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.355536 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8m85n"] Oct 01 14:21:01 crc kubenswrapper[4749]: E1001 14:21:01.357749 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerName="extract-content" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.357832 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerName="extract-content" Oct 01 14:21:01 crc kubenswrapper[4749]: E1001 14:21:01.357908 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerName="extract-utilities" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.357965 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerName="extract-utilities" Oct 01 14:21:01 crc kubenswrapper[4749]: E1001 14:21:01.358038 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerName="registry-server" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.358100 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" 
containerName="registry-server" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.358389 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a08f06b-9a80-43ee-9fef-0e722720ce8e" containerName="registry-server" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.359846 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.370309 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8m85n"] Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.558759 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-utilities\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.558998 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-catalog-content\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.559385 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mhk\" (UniqueName: \"kubernetes.io/projected/b8ff789b-51fc-4e5c-a307-23974e3841dd-kube-api-access-59mhk\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.661068 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-utilities\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.661177 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-catalog-content\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.661294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mhk\" (UniqueName: \"kubernetes.io/projected/b8ff789b-51fc-4e5c-a307-23974e3841dd-kube-api-access-59mhk\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.661654 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-utilities\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.661754 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-catalog-content\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.681374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mhk\" (UniqueName: 
\"kubernetes.io/projected/b8ff789b-51fc-4e5c-a307-23974e3841dd-kube-api-access-59mhk\") pod \"redhat-marketplace-8m85n\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:01 crc kubenswrapper[4749]: I1001 14:21:01.981107 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:02 crc kubenswrapper[4749]: I1001 14:21:02.400105 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8m85n"] Oct 01 14:21:02 crc kubenswrapper[4749]: W1001 14:21:02.402234 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8ff789b_51fc_4e5c_a307_23974e3841dd.slice/crio-5909010bf2eb452bc9a9662f34beb805ff87cfc1130caea32a01bf3e28f2bc7d WatchSource:0}: Error finding container 5909010bf2eb452bc9a9662f34beb805ff87cfc1130caea32a01bf3e28f2bc7d: Status 404 returned error can't find the container with id 5909010bf2eb452bc9a9662f34beb805ff87cfc1130caea32a01bf3e28f2bc7d Oct 01 14:21:03 crc kubenswrapper[4749]: I1001 14:21:03.306660 4749 generic.go:334] "Generic (PLEG): container finished" podID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerID="9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256" exitCode=0 Oct 01 14:21:03 crc kubenswrapper[4749]: I1001 14:21:03.307076 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8m85n" event={"ID":"b8ff789b-51fc-4e5c-a307-23974e3841dd","Type":"ContainerDied","Data":"9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256"} Oct 01 14:21:03 crc kubenswrapper[4749]: I1001 14:21:03.307110 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8m85n" 
event={"ID":"b8ff789b-51fc-4e5c-a307-23974e3841dd","Type":"ContainerStarted","Data":"5909010bf2eb452bc9a9662f34beb805ff87cfc1130caea32a01bf3e28f2bc7d"} Oct 01 14:21:04 crc kubenswrapper[4749]: I1001 14:21:04.318116 4749 generic.go:334] "Generic (PLEG): container finished" podID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerID="7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2" exitCode=0 Oct 01 14:21:04 crc kubenswrapper[4749]: I1001 14:21:04.318209 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8m85n" event={"ID":"b8ff789b-51fc-4e5c-a307-23974e3841dd","Type":"ContainerDied","Data":"7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2"} Oct 01 14:21:05 crc kubenswrapper[4749]: I1001 14:21:05.334503 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8m85n" event={"ID":"b8ff789b-51fc-4e5c-a307-23974e3841dd","Type":"ContainerStarted","Data":"573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0"} Oct 01 14:21:05 crc kubenswrapper[4749]: I1001 14:21:05.364936 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8m85n" podStartSLOduration=2.6029498970000002 podStartE2EDuration="4.364913166s" podCreationTimestamp="2025-10-01 14:21:01 +0000 UTC" firstStartedPulling="2025-10-01 14:21:03.312298854 +0000 UTC m=+4523.366283753" lastFinishedPulling="2025-10-01 14:21:05.074262133 +0000 UTC m=+4525.128247022" observedRunningTime="2025-10-01 14:21:05.35401133 +0000 UTC m=+4525.407996239" watchObservedRunningTime="2025-10-01 14:21:05.364913166 +0000 UTC m=+4525.418898085" Oct 01 14:21:06 crc kubenswrapper[4749]: I1001 14:21:06.229937 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:21:06 crc kubenswrapper[4749]: E1001 14:21:06.230506 4749 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:21:11 crc kubenswrapper[4749]: I1001 14:21:11.981305 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:11 crc kubenswrapper[4749]: I1001 14:21:11.981877 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:12 crc kubenswrapper[4749]: I1001 14:21:12.040154 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:12 crc kubenswrapper[4749]: I1001 14:21:12.496469 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:12 crc kubenswrapper[4749]: I1001 14:21:12.547821 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8m85n"] Oct 01 14:21:14 crc kubenswrapper[4749]: I1001 14:21:14.427771 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8m85n" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerName="registry-server" containerID="cri-o://573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0" gracePeriod=2 Oct 01 14:21:14 crc kubenswrapper[4749]: I1001 14:21:14.916343 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.027804 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-utilities\") pod \"b8ff789b-51fc-4e5c-a307-23974e3841dd\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.028107 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mhk\" (UniqueName: \"kubernetes.io/projected/b8ff789b-51fc-4e5c-a307-23974e3841dd-kube-api-access-59mhk\") pod \"b8ff789b-51fc-4e5c-a307-23974e3841dd\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.028179 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-catalog-content\") pod \"b8ff789b-51fc-4e5c-a307-23974e3841dd\" (UID: \"b8ff789b-51fc-4e5c-a307-23974e3841dd\") " Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.029085 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-utilities" (OuterVolumeSpecName: "utilities") pod "b8ff789b-51fc-4e5c-a307-23974e3841dd" (UID: "b8ff789b-51fc-4e5c-a307-23974e3841dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.040357 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8ff789b-51fc-4e5c-a307-23974e3841dd" (UID: "b8ff789b-51fc-4e5c-a307-23974e3841dd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.131272 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.131313 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8ff789b-51fc-4e5c-a307-23974e3841dd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.439502 4749 generic.go:334] "Generic (PLEG): container finished" podID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerID="573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0" exitCode=0 Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.439546 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8m85n" event={"ID":"b8ff789b-51fc-4e5c-a307-23974e3841dd","Type":"ContainerDied","Data":"573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0"} Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.439574 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8m85n" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.439614 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8m85n" event={"ID":"b8ff789b-51fc-4e5c-a307-23974e3841dd","Type":"ContainerDied","Data":"5909010bf2eb452bc9a9662f34beb805ff87cfc1130caea32a01bf3e28f2bc7d"} Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.439635 4749 scope.go:117] "RemoveContainer" containerID="573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.458354 4749 scope.go:117] "RemoveContainer" containerID="7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.737303 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ff789b-51fc-4e5c-a307-23974e3841dd-kube-api-access-59mhk" (OuterVolumeSpecName: "kube-api-access-59mhk") pod "b8ff789b-51fc-4e5c-a307-23974e3841dd" (UID: "b8ff789b-51fc-4e5c-a307-23974e3841dd"). InnerVolumeSpecName "kube-api-access-59mhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.745384 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59mhk\" (UniqueName: \"kubernetes.io/projected/b8ff789b-51fc-4e5c-a307-23974e3841dd-kube-api-access-59mhk\") on node \"crc\" DevicePath \"\"" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.751203 4749 scope.go:117] "RemoveContainer" containerID="9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.853299 4749 scope.go:117] "RemoveContainer" containerID="573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0" Oct 01 14:21:15 crc kubenswrapper[4749]: E1001 14:21:15.853730 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0\": container with ID starting with 573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0 not found: ID does not exist" containerID="573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.853773 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0"} err="failed to get container status \"573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0\": rpc error: code = NotFound desc = could not find container \"573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0\": container with ID starting with 573bbafd1b7409a06824074333bf23966c32c0c69eb61a72158a49461062aaa0 not found: ID does not exist" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.853803 4749 scope.go:117] "RemoveContainer" containerID="7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2" Oct 01 14:21:15 crc kubenswrapper[4749]: E1001 14:21:15.854100 
4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2\": container with ID starting with 7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2 not found: ID does not exist" containerID="7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.854134 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2"} err="failed to get container status \"7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2\": rpc error: code = NotFound desc = could not find container \"7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2\": container with ID starting with 7138a94b79559aa2fb7e9a2c54d39c41e4f961debf4614dc8bc7d020d2bbbdd2 not found: ID does not exist" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.854153 4749 scope.go:117] "RemoveContainer" containerID="9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256" Oct 01 14:21:15 crc kubenswrapper[4749]: E1001 14:21:15.854365 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256\": container with ID starting with 9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256 not found: ID does not exist" containerID="9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.854391 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256"} err="failed to get container status \"9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256\": rpc error: code = 
NotFound desc = could not find container \"9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256\": container with ID starting with 9927ebdf0f264cd8653d5c12a2d7ebbf1c9954adbb647cf285057941f8bab256 not found: ID does not exist" Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.916359 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8m85n"] Oct 01 14:21:15 crc kubenswrapper[4749]: I1001 14:21:15.925671 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8m85n"] Oct 01 14:21:17 crc kubenswrapper[4749]: I1001 14:21:17.262686 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" path="/var/lib/kubelet/pods/b8ff789b-51fc-4e5c-a307-23974e3841dd/volumes" Oct 01 14:21:19 crc kubenswrapper[4749]: I1001 14:21:19.230877 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:21:19 crc kubenswrapper[4749]: E1001 14:21:19.231105 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:21:32 crc kubenswrapper[4749]: I1001 14:21:32.230885 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:21:32 crc kubenswrapper[4749]: E1001 14:21:32.232281 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:21:46 crc kubenswrapper[4749]: I1001 14:21:46.229769 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:21:46 crc kubenswrapper[4749]: E1001 14:21:46.230504 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:22:00 crc kubenswrapper[4749]: I1001 14:22:00.230870 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:22:00 crc kubenswrapper[4749]: E1001 14:22:00.231588 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.273501 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9mq5b"] Oct 01 14:22:04 crc kubenswrapper[4749]: E1001 14:22:04.274355 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerName="extract-content" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.274368 
4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerName="extract-content" Oct 01 14:22:04 crc kubenswrapper[4749]: E1001 14:22:04.274402 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerName="registry-server" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.274408 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerName="registry-server" Oct 01 14:22:04 crc kubenswrapper[4749]: E1001 14:22:04.274426 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerName="extract-utilities" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.274432 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerName="extract-utilities" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.274619 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ff789b-51fc-4e5c-a307-23974e3841dd" containerName="registry-server" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.276007 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.292032 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9mq5b"] Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.417713 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbvv8\" (UniqueName: \"kubernetes.io/projected/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-kube-api-access-fbvv8\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.418076 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-utilities\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.418177 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-catalog-content\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.520234 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-utilities\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.520294 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-catalog-content\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.520369 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbvv8\" (UniqueName: \"kubernetes.io/projected/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-kube-api-access-fbvv8\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.520974 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-utilities\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.521186 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-catalog-content\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.542221 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbvv8\" (UniqueName: \"kubernetes.io/projected/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-kube-api-access-fbvv8\") pod \"redhat-operators-9mq5b\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:04 crc kubenswrapper[4749]: I1001 14:22:04.599918 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:05 crc kubenswrapper[4749]: W1001 14:22:05.111330 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b7d9be0_021b_4c1c_b509_2c19edb7dce4.slice/crio-745a5e9f4823ee852deb13c9064af863630cb010ac87eca88485018fe953bfd5 WatchSource:0}: Error finding container 745a5e9f4823ee852deb13c9064af863630cb010ac87eca88485018fe953bfd5: Status 404 returned error can't find the container with id 745a5e9f4823ee852deb13c9064af863630cb010ac87eca88485018fe953bfd5 Oct 01 14:22:05 crc kubenswrapper[4749]: I1001 14:22:05.113172 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9mq5b"] Oct 01 14:22:05 crc kubenswrapper[4749]: I1001 14:22:05.967568 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerID="0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b" exitCode=0 Oct 01 14:22:05 crc kubenswrapper[4749]: I1001 14:22:05.967663 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mq5b" event={"ID":"3b7d9be0-021b-4c1c-b509-2c19edb7dce4","Type":"ContainerDied","Data":"0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b"} Oct 01 14:22:05 crc kubenswrapper[4749]: I1001 14:22:05.968151 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mq5b" event={"ID":"3b7d9be0-021b-4c1c-b509-2c19edb7dce4","Type":"ContainerStarted","Data":"745a5e9f4823ee852deb13c9064af863630cb010ac87eca88485018fe953bfd5"} Oct 01 14:22:06 crc kubenswrapper[4749]: I1001 14:22:06.980681 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mq5b" 
event={"ID":"3b7d9be0-021b-4c1c-b509-2c19edb7dce4","Type":"ContainerStarted","Data":"3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d"} Oct 01 14:22:07 crc kubenswrapper[4749]: I1001 14:22:07.991248 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerID="3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d" exitCode=0 Oct 01 14:22:07 crc kubenswrapper[4749]: I1001 14:22:07.991301 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mq5b" event={"ID":"3b7d9be0-021b-4c1c-b509-2c19edb7dce4","Type":"ContainerDied","Data":"3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d"} Oct 01 14:22:10 crc kubenswrapper[4749]: I1001 14:22:10.009973 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mq5b" event={"ID":"3b7d9be0-021b-4c1c-b509-2c19edb7dce4","Type":"ContainerStarted","Data":"ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68"} Oct 01 14:22:10 crc kubenswrapper[4749]: I1001 14:22:10.027037 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9mq5b" podStartSLOduration=3.5466110459999998 podStartE2EDuration="6.027018446s" podCreationTimestamp="2025-10-01 14:22:04 +0000 UTC" firstStartedPulling="2025-10-01 14:22:05.969650037 +0000 UTC m=+4586.023634936" lastFinishedPulling="2025-10-01 14:22:08.450057437 +0000 UTC m=+4588.504042336" observedRunningTime="2025-10-01 14:22:10.025844462 +0000 UTC m=+4590.079829381" watchObservedRunningTime="2025-10-01 14:22:10.027018446 +0000 UTC m=+4590.081003375" Oct 01 14:22:14 crc kubenswrapper[4749]: I1001 14:22:14.230202 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:22:14 crc kubenswrapper[4749]: I1001 14:22:14.601679 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:14 crc kubenswrapper[4749]: I1001 14:22:14.602031 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:14 crc kubenswrapper[4749]: I1001 14:22:14.663362 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:15 crc kubenswrapper[4749]: I1001 14:22:15.060702 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"e9291d236f031c64711e4ca8349b7c3586c9a37e92696e8dae2d5cc20ce6c9e9"} Oct 01 14:22:15 crc kubenswrapper[4749]: I1001 14:22:15.120978 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:15 crc kubenswrapper[4749]: I1001 14:22:15.175969 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9mq5b"] Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.077921 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9mq5b" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerName="registry-server" containerID="cri-o://ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68" gracePeriod=2 Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.530755 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.574453 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbvv8\" (UniqueName: \"kubernetes.io/projected/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-kube-api-access-fbvv8\") pod \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.574530 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-utilities\") pod \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.574809 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-catalog-content\") pod \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\" (UID: \"3b7d9be0-021b-4c1c-b509-2c19edb7dce4\") " Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.575630 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-utilities" (OuterVolumeSpecName: "utilities") pod "3b7d9be0-021b-4c1c-b509-2c19edb7dce4" (UID: "3b7d9be0-021b-4c1c-b509-2c19edb7dce4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.580377 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-kube-api-access-fbvv8" (OuterVolumeSpecName: "kube-api-access-fbvv8") pod "3b7d9be0-021b-4c1c-b509-2c19edb7dce4" (UID: "3b7d9be0-021b-4c1c-b509-2c19edb7dce4"). InnerVolumeSpecName "kube-api-access-fbvv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.666430 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b7d9be0-021b-4c1c-b509-2c19edb7dce4" (UID: "3b7d9be0-021b-4c1c-b509-2c19edb7dce4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.677177 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.677236 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbvv8\" (UniqueName: \"kubernetes.io/projected/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-kube-api-access-fbvv8\") on node \"crc\" DevicePath \"\"" Oct 01 14:22:17 crc kubenswrapper[4749]: I1001 14:22:17.677247 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b7d9be0-021b-4c1c-b509-2c19edb7dce4-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.088919 4749 generic.go:334] "Generic (PLEG): container finished" podID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerID="ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68" exitCode=0 Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.088965 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9mq5b" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.088968 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mq5b" event={"ID":"3b7d9be0-021b-4c1c-b509-2c19edb7dce4","Type":"ContainerDied","Data":"ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68"} Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.089067 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9mq5b" event={"ID":"3b7d9be0-021b-4c1c-b509-2c19edb7dce4","Type":"ContainerDied","Data":"745a5e9f4823ee852deb13c9064af863630cb010ac87eca88485018fe953bfd5"} Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.089090 4749 scope.go:117] "RemoveContainer" containerID="ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.110681 4749 scope.go:117] "RemoveContainer" containerID="3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.132447 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9mq5b"] Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.146965 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9mq5b"] Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.151276 4749 scope.go:117] "RemoveContainer" containerID="0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.188515 4749 scope.go:117] "RemoveContainer" containerID="ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68" Oct 01 14:22:18 crc kubenswrapper[4749]: E1001 14:22:18.189065 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68\": container with ID starting with ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68 not found: ID does not exist" containerID="ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.189117 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68"} err="failed to get container status \"ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68\": rpc error: code = NotFound desc = could not find container \"ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68\": container with ID starting with ab0b01af9bcff24337f051c48f20b53f9b12a537f6a227fe067121bebdb63b68 not found: ID does not exist" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.189149 4749 scope.go:117] "RemoveContainer" containerID="3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d" Oct 01 14:22:18 crc kubenswrapper[4749]: E1001 14:22:18.189695 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d\": container with ID starting with 3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d not found: ID does not exist" containerID="3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.189802 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d"} err="failed to get container status \"3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d\": rpc error: code = NotFound desc = could not find container \"3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d\": container with ID 
starting with 3b3b8b50d7536ddc248415c39a79dcabddf7b1f15e0f544e0697559e87e01e8d not found: ID does not exist" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.189881 4749 scope.go:117] "RemoveContainer" containerID="0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b" Oct 01 14:22:18 crc kubenswrapper[4749]: E1001 14:22:18.190504 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b\": container with ID starting with 0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b not found: ID does not exist" containerID="0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b" Oct 01 14:22:18 crc kubenswrapper[4749]: I1001 14:22:18.190550 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b"} err="failed to get container status \"0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b\": rpc error: code = NotFound desc = could not find container \"0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b\": container with ID starting with 0f8ba035ca022c52a664235f6a15f0b188a47416ca57ab1f5a69ac8ac9292a3b not found: ID does not exist" Oct 01 14:22:19 crc kubenswrapper[4749]: I1001 14:22:19.246122 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" path="/var/lib/kubelet/pods/3b7d9be0-021b-4c1c-b509-2c19edb7dce4/volumes" Oct 01 14:23:40 crc kubenswrapper[4749]: E1001 14:23:40.642614 4749 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.220:58422->38.102.83.220:34693: read tcp 38.102.83.220:58422->38.102.83.220:34693: read: connection reset by peer Oct 01 14:24:32 crc kubenswrapper[4749]: I1001 14:24:32.106742 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:24:32 crc kubenswrapper[4749]: I1001 14:24:32.107258 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:25:02 crc kubenswrapper[4749]: I1001 14:25:02.106301 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:25:02 crc kubenswrapper[4749]: I1001 14:25:02.106928 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.390848 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ckdkz"] Oct 01 14:25:09 crc kubenswrapper[4749]: E1001 14:25:09.391833 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerName="registry-server" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.391850 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerName="registry-server" Oct 01 14:25:09 crc kubenswrapper[4749]: E1001 14:25:09.391872 4749 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerName="extract-content" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.391904 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerName="extract-content" Oct 01 14:25:09 crc kubenswrapper[4749]: E1001 14:25:09.391920 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerName="extract-utilities" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.391928 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerName="extract-utilities" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.392117 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b7d9be0-021b-4c1c-b509-2c19edb7dce4" containerName="registry-server" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.393620 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.406932 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ckdkz"] Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.515291 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-catalog-content\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.515458 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-utilities\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.515517 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvwqx\" (UniqueName: \"kubernetes.io/projected/1140b72f-8f5e-4556-bb03-7897ad7f100c-kube-api-access-vvwqx\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.617572 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-catalog-content\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.617725 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-utilities\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.617795 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvwqx\" (UniqueName: \"kubernetes.io/projected/1140b72f-8f5e-4556-bb03-7897ad7f100c-kube-api-access-vvwqx\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.618013 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-catalog-content\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.618158 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-utilities\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.638822 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvwqx\" (UniqueName: \"kubernetes.io/projected/1140b72f-8f5e-4556-bb03-7897ad7f100c-kube-api-access-vvwqx\") pod \"certified-operators-ckdkz\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:09 crc kubenswrapper[4749]: I1001 14:25:09.715172 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:10 crc kubenswrapper[4749]: I1001 14:25:10.340425 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ckdkz"] Oct 01 14:25:11 crc kubenswrapper[4749]: I1001 14:25:11.087552 4749 generic.go:334] "Generic (PLEG): container finished" podID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerID="f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6" exitCode=0 Oct 01 14:25:11 crc kubenswrapper[4749]: I1001 14:25:11.087603 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ckdkz" event={"ID":"1140b72f-8f5e-4556-bb03-7897ad7f100c","Type":"ContainerDied","Data":"f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6"} Oct 01 14:25:11 crc kubenswrapper[4749]: I1001 14:25:11.087636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ckdkz" event={"ID":"1140b72f-8f5e-4556-bb03-7897ad7f100c","Type":"ContainerStarted","Data":"26310381b6df0f2f6d6d23221aa07bbd5a51d4a500af068235e99d29b2c4e72d"} Oct 01 14:25:12 crc kubenswrapper[4749]: I1001 14:25:12.099463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ckdkz" event={"ID":"1140b72f-8f5e-4556-bb03-7897ad7f100c","Type":"ContainerStarted","Data":"4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e"} Oct 01 14:25:14 crc kubenswrapper[4749]: I1001 14:25:14.127360 4749 generic.go:334] "Generic (PLEG): container finished" podID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerID="4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e" exitCode=0 Oct 01 14:25:14 crc kubenswrapper[4749]: I1001 14:25:14.127456 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ckdkz" 
event={"ID":"1140b72f-8f5e-4556-bb03-7897ad7f100c","Type":"ContainerDied","Data":"4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e"} Oct 01 14:25:15 crc kubenswrapper[4749]: I1001 14:25:15.139648 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ckdkz" event={"ID":"1140b72f-8f5e-4556-bb03-7897ad7f100c","Type":"ContainerStarted","Data":"13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb"} Oct 01 14:25:15 crc kubenswrapper[4749]: I1001 14:25:15.156242 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ckdkz" podStartSLOduration=2.560967368 podStartE2EDuration="6.156208316s" podCreationTimestamp="2025-10-01 14:25:09 +0000 UTC" firstStartedPulling="2025-10-01 14:25:11.090607452 +0000 UTC m=+4771.144592361" lastFinishedPulling="2025-10-01 14:25:14.68584841 +0000 UTC m=+4774.739833309" observedRunningTime="2025-10-01 14:25:15.156189775 +0000 UTC m=+4775.210174684" watchObservedRunningTime="2025-10-01 14:25:15.156208316 +0000 UTC m=+4775.210193215" Oct 01 14:25:19 crc kubenswrapper[4749]: I1001 14:25:19.715919 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:19 crc kubenswrapper[4749]: I1001 14:25:19.716469 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:19 crc kubenswrapper[4749]: I1001 14:25:19.772250 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:20 crc kubenswrapper[4749]: I1001 14:25:20.512680 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:20 crc kubenswrapper[4749]: I1001 14:25:20.585910 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-ckdkz"] Oct 01 14:25:22 crc kubenswrapper[4749]: I1001 14:25:22.214294 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ckdkz" podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerName="registry-server" containerID="cri-o://13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb" gracePeriod=2 Oct 01 14:25:22 crc kubenswrapper[4749]: I1001 14:25:22.963444 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.115266 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-utilities\") pod \"1140b72f-8f5e-4556-bb03-7897ad7f100c\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.115592 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-catalog-content\") pod \"1140b72f-8f5e-4556-bb03-7897ad7f100c\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.115926 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvwqx\" (UniqueName: \"kubernetes.io/projected/1140b72f-8f5e-4556-bb03-7897ad7f100c-kube-api-access-vvwqx\") pod \"1140b72f-8f5e-4556-bb03-7897ad7f100c\" (UID: \"1140b72f-8f5e-4556-bb03-7897ad7f100c\") " Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.116361 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-utilities" (OuterVolumeSpecName: "utilities") pod "1140b72f-8f5e-4556-bb03-7897ad7f100c" (UID: 
"1140b72f-8f5e-4556-bb03-7897ad7f100c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.116992 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.122412 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1140b72f-8f5e-4556-bb03-7897ad7f100c-kube-api-access-vvwqx" (OuterVolumeSpecName: "kube-api-access-vvwqx") pod "1140b72f-8f5e-4556-bb03-7897ad7f100c" (UID: "1140b72f-8f5e-4556-bb03-7897ad7f100c"). InnerVolumeSpecName "kube-api-access-vvwqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.197655 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1140b72f-8f5e-4556-bb03-7897ad7f100c" (UID: "1140b72f-8f5e-4556-bb03-7897ad7f100c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.219409 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvwqx\" (UniqueName: \"kubernetes.io/projected/1140b72f-8f5e-4556-bb03-7897ad7f100c-kube-api-access-vvwqx\") on node \"crc\" DevicePath \"\"" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.219443 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1140b72f-8f5e-4556-bb03-7897ad7f100c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.227562 4749 generic.go:334] "Generic (PLEG): container finished" podID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerID="13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb" exitCode=0 Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.227624 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ckdkz" event={"ID":"1140b72f-8f5e-4556-bb03-7897ad7f100c","Type":"ContainerDied","Data":"13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb"} Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.227664 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ckdkz" event={"ID":"1140b72f-8f5e-4556-bb03-7897ad7f100c","Type":"ContainerDied","Data":"26310381b6df0f2f6d6d23221aa07bbd5a51d4a500af068235e99d29b2c4e72d"} Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.227689 4749 scope.go:117] "RemoveContainer" containerID="13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.227688 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ckdkz" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.272156 4749 scope.go:117] "RemoveContainer" containerID="4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.282372 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ckdkz"] Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.293691 4749 scope.go:117] "RemoveContainer" containerID="f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.294545 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ckdkz"] Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.341705 4749 scope.go:117] "RemoveContainer" containerID="13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb" Oct 01 14:25:23 crc kubenswrapper[4749]: E1001 14:25:23.342652 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb\": container with ID starting with 13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb not found: ID does not exist" containerID="13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.342761 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb"} err="failed to get container status \"13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb\": rpc error: code = NotFound desc = could not find container \"13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb\": container with ID starting with 13fd275c22c85a738dec37f01246f63153842319a3b5dd9971202f0808dd8ebb not 
found: ID does not exist" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.342794 4749 scope.go:117] "RemoveContainer" containerID="4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e" Oct 01 14:25:23 crc kubenswrapper[4749]: E1001 14:25:23.343165 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e\": container with ID starting with 4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e not found: ID does not exist" containerID="4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.343247 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e"} err="failed to get container status \"4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e\": rpc error: code = NotFound desc = could not find container \"4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e\": container with ID starting with 4e7c59d06bfc4e5cae34227814f42268cee3342bede5c9ba5af69ba40d524e9e not found: ID does not exist" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.343298 4749 scope.go:117] "RemoveContainer" containerID="f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6" Oct 01 14:25:23 crc kubenswrapper[4749]: E1001 14:25:23.343588 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6\": container with ID starting with f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6 not found: ID does not exist" containerID="f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6" Oct 01 14:25:23 crc kubenswrapper[4749]: I1001 14:25:23.343618 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6"} err="failed to get container status \"f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6\": rpc error: code = NotFound desc = could not find container \"f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6\": container with ID starting with f80a3cb682e2e46eaebf11cedb2353549c9f94cc55f54e3e6b8436d2a05b89a6 not found: ID does not exist" Oct 01 14:25:25 crc kubenswrapper[4749]: I1001 14:25:25.246952 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" path="/var/lib/kubelet/pods/1140b72f-8f5e-4556-bb03-7897ad7f100c/volumes" Oct 01 14:25:32 crc kubenswrapper[4749]: I1001 14:25:32.106773 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:25:32 crc kubenswrapper[4749]: I1001 14:25:32.107506 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:25:32 crc kubenswrapper[4749]: I1001 14:25:32.107577 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 14:25:32 crc kubenswrapper[4749]: I1001 14:25:32.110383 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e9291d236f031c64711e4ca8349b7c3586c9a37e92696e8dae2d5cc20ce6c9e9"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:25:32 crc kubenswrapper[4749]: I1001 14:25:32.110507 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://e9291d236f031c64711e4ca8349b7c3586c9a37e92696e8dae2d5cc20ce6c9e9" gracePeriod=600 Oct 01 14:25:33 crc kubenswrapper[4749]: I1001 14:25:33.341924 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="e9291d236f031c64711e4ca8349b7c3586c9a37e92696e8dae2d5cc20ce6c9e9" exitCode=0 Oct 01 14:25:33 crc kubenswrapper[4749]: I1001 14:25:33.342002 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"e9291d236f031c64711e4ca8349b7c3586c9a37e92696e8dae2d5cc20ce6c9e9"} Oct 01 14:25:33 crc kubenswrapper[4749]: I1001 14:25:33.343074 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"} Oct 01 14:25:33 crc kubenswrapper[4749]: I1001 14:25:33.343112 4749 scope.go:117] "RemoveContainer" containerID="25ad421cab9139f630c69e1d96851b0494a591d317c79ae8e8372d713c4cb404" Oct 01 14:27:32 crc kubenswrapper[4749]: I1001 14:27:32.106543 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:27:32 crc kubenswrapper[4749]: I1001 14:27:32.107057 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:28:02 crc kubenswrapper[4749]: I1001 14:28:02.107342 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:28:02 crc kubenswrapper[4749]: I1001 14:28:02.108651 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.106375 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.106970 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.107027 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.107832 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.107883 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" gracePeriod=600 Oct 01 14:28:32 crc kubenswrapper[4749]: E1001 14:28:32.232329 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.274359 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" exitCode=0 Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.274400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" 
event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"} Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.274431 4749 scope.go:117] "RemoveContainer" containerID="e9291d236f031c64711e4ca8349b7c3586c9a37e92696e8dae2d5cc20ce6c9e9" Oct 01 14:28:32 crc kubenswrapper[4749]: I1001 14:28:32.275126 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:28:32 crc kubenswrapper[4749]: E1001 14:28:32.275408 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:28:47 crc kubenswrapper[4749]: I1001 14:28:47.230493 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:28:47 crc kubenswrapper[4749]: E1001 14:28:47.231736 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:29:01 crc kubenswrapper[4749]: I1001 14:29:01.237064 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:29:01 crc kubenswrapper[4749]: E1001 14:29:01.237989 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:29:14 crc kubenswrapper[4749]: I1001 14:29:14.230080 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:29:14 crc kubenswrapper[4749]: E1001 14:29:14.231050 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:29:29 crc kubenswrapper[4749]: I1001 14:29:29.231674 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:29:29 crc kubenswrapper[4749]: E1001 14:29:29.233078 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:29:44 crc kubenswrapper[4749]: I1001 14:29:44.230212 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:29:44 crc kubenswrapper[4749]: E1001 14:29:44.231455 4749 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:29:59 crc kubenswrapper[4749]: I1001 14:29:59.231731 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:29:59 crc kubenswrapper[4749]: E1001 14:29:59.232868 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.168187 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6"] Oct 01 14:30:00 crc kubenswrapper[4749]: E1001 14:30:00.168707 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerName="extract-utilities" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.168723 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerName="extract-utilities" Oct 01 14:30:00 crc kubenswrapper[4749]: E1001 14:30:00.168770 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerName="registry-server" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.168778 4749 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerName="registry-server" Oct 01 14:30:00 crc kubenswrapper[4749]: E1001 14:30:00.168803 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerName="extract-content" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.168811 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerName="extract-content" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.169049 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1140b72f-8f5e-4556-bb03-7897ad7f100c" containerName="registry-server" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.169849 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.172302 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.172828 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.187398 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6"] Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.261938 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/463a37f2-e2a2-4385-98bb-d00549dfd5a2-config-volume\") pod \"collect-profiles-29322150-x8fv6\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.262014 4749 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcqqt\" (UniqueName: \"kubernetes.io/projected/463a37f2-e2a2-4385-98bb-d00549dfd5a2-kube-api-access-gcqqt\") pod \"collect-profiles-29322150-x8fv6\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.262099 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/463a37f2-e2a2-4385-98bb-d00549dfd5a2-secret-volume\") pod \"collect-profiles-29322150-x8fv6\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.364212 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/463a37f2-e2a2-4385-98bb-d00549dfd5a2-config-volume\") pod \"collect-profiles-29322150-x8fv6\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.364335 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcqqt\" (UniqueName: \"kubernetes.io/projected/463a37f2-e2a2-4385-98bb-d00549dfd5a2-kube-api-access-gcqqt\") pod \"collect-profiles-29322150-x8fv6\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.365483 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/463a37f2-e2a2-4385-98bb-d00549dfd5a2-config-volume\") pod \"collect-profiles-29322150-x8fv6\" (UID: 
\"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.370533 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/463a37f2-e2a2-4385-98bb-d00549dfd5a2-secret-volume\") pod \"collect-profiles-29322150-x8fv6\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.382346 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/463a37f2-e2a2-4385-98bb-d00549dfd5a2-secret-volume\") pod \"collect-profiles-29322150-x8fv6\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.385896 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcqqt\" (UniqueName: \"kubernetes.io/projected/463a37f2-e2a2-4385-98bb-d00549dfd5a2-kube-api-access-gcqqt\") pod \"collect-profiles-29322150-x8fv6\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.510943 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:00 crc kubenswrapper[4749]: I1001 14:30:00.989492 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6"] Oct 01 14:30:01 crc kubenswrapper[4749]: I1001 14:30:01.282068 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" event={"ID":"463a37f2-e2a2-4385-98bb-d00549dfd5a2","Type":"ContainerStarted","Data":"88f10a27c49a343edbbc9fb9295c81dce8e8a591d430e7babea44d198111a361"} Oct 01 14:30:02 crc kubenswrapper[4749]: I1001 14:30:02.294849 4749 generic.go:334] "Generic (PLEG): container finished" podID="463a37f2-e2a2-4385-98bb-d00549dfd5a2" containerID="14a8c49c9931630c3deea9eeb8d463499a2b06130e52aa2252610fdf4ce36662" exitCode=0 Oct 01 14:30:02 crc kubenswrapper[4749]: I1001 14:30:02.294983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" event={"ID":"463a37f2-e2a2-4385-98bb-d00549dfd5a2","Type":"ContainerDied","Data":"14a8c49c9931630c3deea9eeb8d463499a2b06130e52aa2252610fdf4ce36662"} Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.647344 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.740974 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/463a37f2-e2a2-4385-98bb-d00549dfd5a2-secret-volume\") pod \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.741049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcqqt\" (UniqueName: \"kubernetes.io/projected/463a37f2-e2a2-4385-98bb-d00549dfd5a2-kube-api-access-gcqqt\") pod \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.741297 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/463a37f2-e2a2-4385-98bb-d00549dfd5a2-config-volume\") pod \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\" (UID: \"463a37f2-e2a2-4385-98bb-d00549dfd5a2\") " Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.741683 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/463a37f2-e2a2-4385-98bb-d00549dfd5a2-config-volume" (OuterVolumeSpecName: "config-volume") pod "463a37f2-e2a2-4385-98bb-d00549dfd5a2" (UID: "463a37f2-e2a2-4385-98bb-d00549dfd5a2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.741904 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/463a37f2-e2a2-4385-98bb-d00549dfd5a2-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.745669 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463a37f2-e2a2-4385-98bb-d00549dfd5a2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "463a37f2-e2a2-4385-98bb-d00549dfd5a2" (UID: "463a37f2-e2a2-4385-98bb-d00549dfd5a2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.746100 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463a37f2-e2a2-4385-98bb-d00549dfd5a2-kube-api-access-gcqqt" (OuterVolumeSpecName: "kube-api-access-gcqqt") pod "463a37f2-e2a2-4385-98bb-d00549dfd5a2" (UID: "463a37f2-e2a2-4385-98bb-d00549dfd5a2"). InnerVolumeSpecName "kube-api-access-gcqqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.843908 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcqqt\" (UniqueName: \"kubernetes.io/projected/463a37f2-e2a2-4385-98bb-d00549dfd5a2-kube-api-access-gcqqt\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:03 crc kubenswrapper[4749]: I1001 14:30:03.844296 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/463a37f2-e2a2-4385-98bb-d00549dfd5a2-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:30:04 crc kubenswrapper[4749]: I1001 14:30:04.315355 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" event={"ID":"463a37f2-e2a2-4385-98bb-d00549dfd5a2","Type":"ContainerDied","Data":"88f10a27c49a343edbbc9fb9295c81dce8e8a591d430e7babea44d198111a361"} Oct 01 14:30:04 crc kubenswrapper[4749]: I1001 14:30:04.315400 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f10a27c49a343edbbc9fb9295c81dce8e8a591d430e7babea44d198111a361" Oct 01 14:30:04 crc kubenswrapper[4749]: I1001 14:30:04.315418 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322150-x8fv6" Oct 01 14:30:04 crc kubenswrapper[4749]: I1001 14:30:04.731754 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx"] Oct 01 14:30:04 crc kubenswrapper[4749]: I1001 14:30:04.756953 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322105-mddtx"] Oct 01 14:30:05 crc kubenswrapper[4749]: I1001 14:30:05.244430 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a896be31-cf94-4d34-828a-c386800fb02c" path="/var/lib/kubelet/pods/a896be31-cf94-4d34-828a-c386800fb02c/volumes" Oct 01 14:30:11 crc kubenswrapper[4749]: I1001 14:30:11.081670 4749 scope.go:117] "RemoveContainer" containerID="e1b0221febadd199a13554849d73c3da37b3fd0b6b3c53cb226b7ff4683b2036" Oct 01 14:30:14 crc kubenswrapper[4749]: I1001 14:30:14.229858 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:30:14 crc kubenswrapper[4749]: E1001 14:30:14.230703 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:30:25 crc kubenswrapper[4749]: I1001 14:30:25.230791 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:30:25 crc kubenswrapper[4749]: E1001 14:30:25.232058 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:30:39 crc kubenswrapper[4749]: I1001 14:30:39.232443 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:30:39 crc kubenswrapper[4749]: E1001 14:30:39.233293 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:30:53 crc kubenswrapper[4749]: I1001 14:30:53.230697 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:30:53 crc kubenswrapper[4749]: E1001 14:30:53.232490 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:31:07 crc kubenswrapper[4749]: I1001 14:31:07.230684 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:31:07 crc kubenswrapper[4749]: E1001 14:31:07.231632 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:31:18 crc kubenswrapper[4749]: I1001 14:31:18.230275 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:31:18 crc kubenswrapper[4749]: E1001 14:31:18.231077 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:31:29 crc kubenswrapper[4749]: I1001 14:31:29.230107 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:31:29 crc kubenswrapper[4749]: E1001 14:31:29.231392 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:31:41 crc kubenswrapper[4749]: I1001 14:31:41.241588 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:31:41 crc kubenswrapper[4749]: E1001 14:31:41.243206 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.015762 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-htd5l"] Oct 01 14:31:51 crc kubenswrapper[4749]: E1001 14:31:51.017471 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463a37f2-e2a2-4385-98bb-d00549dfd5a2" containerName="collect-profiles" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.017506 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="463a37f2-e2a2-4385-98bb-d00549dfd5a2" containerName="collect-profiles" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.018064 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="463a37f2-e2a2-4385-98bb-d00549dfd5a2" containerName="collect-profiles" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.022078 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.031185 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-htd5l"] Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.113054 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-utilities\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.113267 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-catalog-content\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.113590 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc6b7\" (UniqueName: \"kubernetes.io/projected/82a522de-d45f-4eaf-ac27-502302c4f0f7-kube-api-access-lc6b7\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.215065 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc6b7\" (UniqueName: \"kubernetes.io/projected/82a522de-d45f-4eaf-ac27-502302c4f0f7-kube-api-access-lc6b7\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.215178 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-utilities\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.215283 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-catalog-content\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.215930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-utilities\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.215973 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-catalog-content\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.235653 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc6b7\" (UniqueName: \"kubernetes.io/projected/82a522de-d45f-4eaf-ac27-502302c4f0f7-kube-api-access-lc6b7\") pod \"redhat-marketplace-htd5l\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") " pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.351472 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htd5l" Oct 01 14:31:51 crc kubenswrapper[4749]: I1001 14:31:51.830808 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-htd5l"] Oct 01 14:31:52 crc kubenswrapper[4749]: I1001 14:31:52.442863 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htd5l" event={"ID":"82a522de-d45f-4eaf-ac27-502302c4f0f7","Type":"ContainerStarted","Data":"3f865284089b13e62f548af7f3a98ff821f26d8b2cdb3bf8013ced7970920b88"} Oct 01 14:31:53 crc kubenswrapper[4749]: I1001 14:31:53.460150 4749 generic.go:334] "Generic (PLEG): container finished" podID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerID="6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8" exitCode=0 Oct 01 14:31:53 crc kubenswrapper[4749]: I1001 14:31:53.460277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htd5l" event={"ID":"82a522de-d45f-4eaf-ac27-502302c4f0f7","Type":"ContainerDied","Data":"6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8"} Oct 01 14:31:53 crc kubenswrapper[4749]: I1001 14:31:53.464929 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:31:55 crc kubenswrapper[4749]: I1001 14:31:55.230544 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:31:55 crc kubenswrapper[4749]: E1001 14:31:55.231118 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 
14:31:55 crc kubenswrapper[4749]: I1001 14:31:55.484681 4749 generic.go:334] "Generic (PLEG): container finished" podID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerID="d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83" exitCode=0
Oct 01 14:31:55 crc kubenswrapper[4749]: I1001 14:31:55.484719 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htd5l" event={"ID":"82a522de-d45f-4eaf-ac27-502302c4f0f7","Type":"ContainerDied","Data":"d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83"}
Oct 01 14:31:56 crc kubenswrapper[4749]: I1001 14:31:56.499148 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htd5l" event={"ID":"82a522de-d45f-4eaf-ac27-502302c4f0f7","Type":"ContainerStarted","Data":"62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43"}
Oct 01 14:31:56 crc kubenswrapper[4749]: I1001 14:31:56.530624 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-htd5l" podStartSLOduration=4.113453002 podStartE2EDuration="6.530597884s" podCreationTimestamp="2025-10-01 14:31:50 +0000 UTC" firstStartedPulling="2025-10-01 14:31:53.464528592 +0000 UTC m=+5173.518513531" lastFinishedPulling="2025-10-01 14:31:55.881673504 +0000 UTC m=+5175.935658413" observedRunningTime="2025-10-01 14:31:56.515436616 +0000 UTC m=+5176.569421575" watchObservedRunningTime="2025-10-01 14:31:56.530597884 +0000 UTC m=+5176.584582793"
Oct 01 14:32:01 crc kubenswrapper[4749]: I1001 14:32:01.352131 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-htd5l"
Oct 01 14:32:01 crc kubenswrapper[4749]: I1001 14:32:01.352803 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-htd5l"
Oct 01 14:32:01 crc kubenswrapper[4749]: I1001 14:32:01.398456 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-htd5l"
Oct 01 14:32:01 crc kubenswrapper[4749]: I1001 14:32:01.618898 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-htd5l"
Oct 01 14:32:01 crc kubenswrapper[4749]: I1001 14:32:01.678133 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-htd5l"]
Oct 01 14:32:03 crc kubenswrapper[4749]: I1001 14:32:03.566395 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-htd5l" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerName="registry-server" containerID="cri-o://62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43" gracePeriod=2
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.018639 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htd5l"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.123012 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc6b7\" (UniqueName: \"kubernetes.io/projected/82a522de-d45f-4eaf-ac27-502302c4f0f7-kube-api-access-lc6b7\") pod \"82a522de-d45f-4eaf-ac27-502302c4f0f7\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") "
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.123105 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-catalog-content\") pod \"82a522de-d45f-4eaf-ac27-502302c4f0f7\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") "
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.123291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-utilities\") pod \"82a522de-d45f-4eaf-ac27-502302c4f0f7\" (UID: \"82a522de-d45f-4eaf-ac27-502302c4f0f7\") "
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.124021 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-utilities" (OuterVolumeSpecName: "utilities") pod "82a522de-d45f-4eaf-ac27-502302c4f0f7" (UID: "82a522de-d45f-4eaf-ac27-502302c4f0f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.132419 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a522de-d45f-4eaf-ac27-502302c4f0f7-kube-api-access-lc6b7" (OuterVolumeSpecName: "kube-api-access-lc6b7") pod "82a522de-d45f-4eaf-ac27-502302c4f0f7" (UID: "82a522de-d45f-4eaf-ac27-502302c4f0f7"). InnerVolumeSpecName "kube-api-access-lc6b7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.139030 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82a522de-d45f-4eaf-ac27-502302c4f0f7" (UID: "82a522de-d45f-4eaf-ac27-502302c4f0f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.225727 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc6b7\" (UniqueName: \"kubernetes.io/projected/82a522de-d45f-4eaf-ac27-502302c4f0f7-kube-api-access-lc6b7\") on node \"crc\" DevicePath \"\""
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.225762 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.225775 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a522de-d45f-4eaf-ac27-502302c4f0f7-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.580788 4749 generic.go:334] "Generic (PLEG): container finished" podID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerID="62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43" exitCode=0
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.580879 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htd5l"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.580865 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htd5l" event={"ID":"82a522de-d45f-4eaf-ac27-502302c4f0f7","Type":"ContainerDied","Data":"62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43"}
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.581316 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htd5l" event={"ID":"82a522de-d45f-4eaf-ac27-502302c4f0f7","Type":"ContainerDied","Data":"3f865284089b13e62f548af7f3a98ff821f26d8b2cdb3bf8013ced7970920b88"}
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.581350 4749 scope.go:117] "RemoveContainer" containerID="62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.603500 4749 scope.go:117] "RemoveContainer" containerID="d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.626023 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-htd5l"]
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.636124 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-htd5l"]
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.648173 4749 scope.go:117] "RemoveContainer" containerID="6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.705496 4749 scope.go:117] "RemoveContainer" containerID="62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43"
Oct 01 14:32:04 crc kubenswrapper[4749]: E1001 14:32:04.706437 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43\": container with ID starting with 62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43 not found: ID does not exist" containerID="62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.706514 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43"} err="failed to get container status \"62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43\": rpc error: code = NotFound desc = could not find container \"62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43\": container with ID starting with 62e21ec2b0ca692131b51f23250775afd54472e2328b716ccc0bbd82fc145d43 not found: ID does not exist"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.706542 4749 scope.go:117] "RemoveContainer" containerID="d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83"
Oct 01 14:32:04 crc kubenswrapper[4749]: E1001 14:32:04.706936 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83\": container with ID starting with d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83 not found: ID does not exist" containerID="d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.706956 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83"} err="failed to get container status \"d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83\": rpc error: code = NotFound desc = could not find container \"d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83\": container with ID starting with d18463155d5fc91461ee2650d3e819194b75a7d45aca01181c25193409478e83 not found: ID does not exist"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.706969 4749 scope.go:117] "RemoveContainer" containerID="6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8"
Oct 01 14:32:04 crc kubenswrapper[4749]: E1001 14:32:04.707513 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8\": container with ID starting with 6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8 not found: ID does not exist" containerID="6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8"
Oct 01 14:32:04 crc kubenswrapper[4749]: I1001 14:32:04.707580 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8"} err="failed to get container status \"6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8\": rpc error: code = NotFound desc = could not find container \"6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8\": container with ID starting with 6b44e4fccf5d1ea21f250df8d0c7d9e9eeb2056e58f9d1fda8df4a18a79c32b8 not found: ID does not exist"
Oct 01 14:32:05 crc kubenswrapper[4749]: I1001 14:32:05.242016 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" path="/var/lib/kubelet/pods/82a522de-d45f-4eaf-ac27-502302c4f0f7/volumes"
Oct 01 14:32:09 crc kubenswrapper[4749]: I1001 14:32:09.230779 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"
Oct 01 14:32:09 crc kubenswrapper[4749]: E1001 14:32:09.232530 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.673488 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gtlr9"]
Oct 01 14:32:18 crc kubenswrapper[4749]: E1001 14:32:18.674829 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerName="registry-server"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.674847 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerName="registry-server"
Oct 01 14:32:18 crc kubenswrapper[4749]: E1001 14:32:18.674878 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerName="extract-utilities"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.674886 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerName="extract-utilities"
Oct 01 14:32:18 crc kubenswrapper[4749]: E1001 14:32:18.674903 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerName="extract-content"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.674911 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerName="extract-content"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.675139 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a522de-d45f-4eaf-ac27-502302c4f0f7" containerName="registry-server"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.676837 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.684608 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gtlr9"]
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.737837 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-catalog-content\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.738360 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-utilities\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.738437 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtpb8\" (UniqueName: \"kubernetes.io/projected/5927c768-10f5-4175-b09f-486084973982-kube-api-access-gtpb8\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.840283 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-utilities\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.840366 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtpb8\" (UniqueName: \"kubernetes.io/projected/5927c768-10f5-4175-b09f-486084973982-kube-api-access-gtpb8\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.840467 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-catalog-content\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.840905 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-utilities\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.840930 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-catalog-content\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:18 crc kubenswrapper[4749]: I1001 14:32:18.863489 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtpb8\" (UniqueName: \"kubernetes.io/projected/5927c768-10f5-4175-b09f-486084973982-kube-api-access-gtpb8\") pod \"redhat-operators-gtlr9\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") " pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:19 crc kubenswrapper[4749]: I1001 14:32:19.017303 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:19 crc kubenswrapper[4749]: I1001 14:32:19.503576 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gtlr9"]
Oct 01 14:32:19 crc kubenswrapper[4749]: I1001 14:32:19.747540 4749 generic.go:334] "Generic (PLEG): container finished" podID="5927c768-10f5-4175-b09f-486084973982" containerID="e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad" exitCode=0
Oct 01 14:32:19 crc kubenswrapper[4749]: I1001 14:32:19.747644 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtlr9" event={"ID":"5927c768-10f5-4175-b09f-486084973982","Type":"ContainerDied","Data":"e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad"}
Oct 01 14:32:19 crc kubenswrapper[4749]: I1001 14:32:19.747837 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtlr9" event={"ID":"5927c768-10f5-4175-b09f-486084973982","Type":"ContainerStarted","Data":"576f83ac47de70b6c2bad8b5fe31fd95b9e70772809fe2e8a4ece5971c36eed1"}
Oct 01 14:32:20 crc kubenswrapper[4749]: I1001 14:32:20.760400 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtlr9" event={"ID":"5927c768-10f5-4175-b09f-486084973982","Type":"ContainerStarted","Data":"e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304"}
Oct 01 14:32:21 crc kubenswrapper[4749]: I1001 14:32:21.235892 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"
Oct 01 14:32:21 crc kubenswrapper[4749]: E1001 14:32:21.236207 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:32:21 crc kubenswrapper[4749]: I1001 14:32:21.772899 4749 generic.go:334] "Generic (PLEG): container finished" podID="5927c768-10f5-4175-b09f-486084973982" containerID="e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304" exitCode=0
Oct 01 14:32:21 crc kubenswrapper[4749]: I1001 14:32:21.773170 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtlr9" event={"ID":"5927c768-10f5-4175-b09f-486084973982","Type":"ContainerDied","Data":"e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304"}
Oct 01 14:32:23 crc kubenswrapper[4749]: I1001 14:32:23.798065 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtlr9" event={"ID":"5927c768-10f5-4175-b09f-486084973982","Type":"ContainerStarted","Data":"6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd"}
Oct 01 14:32:23 crc kubenswrapper[4749]: I1001 14:32:23.829309 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gtlr9" podStartSLOduration=3.209821947 podStartE2EDuration="5.82929018s" podCreationTimestamp="2025-10-01 14:32:18 +0000 UTC" firstStartedPulling="2025-10-01 14:32:19.749001245 +0000 UTC m=+5199.802986134" lastFinishedPulling="2025-10-01 14:32:22.368469428 +0000 UTC m=+5202.422454367" observedRunningTime="2025-10-01 14:32:23.825281065 +0000 UTC m=+5203.879265954" watchObservedRunningTime="2025-10-01 14:32:23.82929018 +0000 UTC m=+5203.883275089"
Oct 01 14:32:29 crc kubenswrapper[4749]: I1001 14:32:29.017559 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:29 crc kubenswrapper[4749]: I1001 14:32:29.019321 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:29 crc kubenswrapper[4749]: I1001 14:32:29.070394 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:29 crc kubenswrapper[4749]: I1001 14:32:29.902541 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:29 crc kubenswrapper[4749]: I1001 14:32:29.958981 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gtlr9"]
Oct 01 14:32:31 crc kubenswrapper[4749]: I1001 14:32:31.887022 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gtlr9" podUID="5927c768-10f5-4175-b09f-486084973982" containerName="registry-server" containerID="cri-o://6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd" gracePeriod=2
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.371411 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.434905 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-catalog-content\") pod \"5927c768-10f5-4175-b09f-486084973982\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") "
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.435050 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtpb8\" (UniqueName: \"kubernetes.io/projected/5927c768-10f5-4175-b09f-486084973982-kube-api-access-gtpb8\") pod \"5927c768-10f5-4175-b09f-486084973982\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") "
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.436480 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-utilities\") pod \"5927c768-10f5-4175-b09f-486084973982\" (UID: \"5927c768-10f5-4175-b09f-486084973982\") "
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.437093 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-utilities" (OuterVolumeSpecName: "utilities") pod "5927c768-10f5-4175-b09f-486084973982" (UID: "5927c768-10f5-4175-b09f-486084973982"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.437643 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.440909 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5927c768-10f5-4175-b09f-486084973982-kube-api-access-gtpb8" (OuterVolumeSpecName: "kube-api-access-gtpb8") pod "5927c768-10f5-4175-b09f-486084973982" (UID: "5927c768-10f5-4175-b09f-486084973982"). InnerVolumeSpecName "kube-api-access-gtpb8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.521611 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5927c768-10f5-4175-b09f-486084973982" (UID: "5927c768-10f5-4175-b09f-486084973982"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.539952 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtpb8\" (UniqueName: \"kubernetes.io/projected/5927c768-10f5-4175-b09f-486084973982-kube-api-access-gtpb8\") on node \"crc\" DevicePath \"\""
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.540006 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5927c768-10f5-4175-b09f-486084973982-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.903607 4749 generic.go:334] "Generic (PLEG): container finished" podID="5927c768-10f5-4175-b09f-486084973982" containerID="6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd" exitCode=0
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.903742 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gtlr9"
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.903788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtlr9" event={"ID":"5927c768-10f5-4175-b09f-486084973982","Type":"ContainerDied","Data":"6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd"}
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.904278 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gtlr9" event={"ID":"5927c768-10f5-4175-b09f-486084973982","Type":"ContainerDied","Data":"576f83ac47de70b6c2bad8b5fe31fd95b9e70772809fe2e8a4ece5971c36eed1"}
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.904326 4749 scope.go:117] "RemoveContainer" containerID="6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd"
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.946925 4749 scope.go:117] "RemoveContainer" containerID="e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304"
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.952728 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gtlr9"]
Oct 01 14:32:32 crc kubenswrapper[4749]: I1001 14:32:32.963760 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gtlr9"]
Oct 01 14:32:33 crc kubenswrapper[4749]: I1001 14:32:33.151307 4749 scope.go:117] "RemoveContainer" containerID="e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad"
Oct 01 14:32:33 crc kubenswrapper[4749]: I1001 14:32:33.220733 4749 scope.go:117] "RemoveContainer" containerID="6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd"
Oct 01 14:32:33 crc kubenswrapper[4749]: E1001 14:32:33.221280 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd\": container with ID starting with 6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd not found: ID does not exist" containerID="6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd"
Oct 01 14:32:33 crc kubenswrapper[4749]: I1001 14:32:33.221326 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd"} err="failed to get container status \"6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd\": rpc error: code = NotFound desc = could not find container \"6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd\": container with ID starting with 6545f848dbbf3428bfc32c3218e19ea4bac86888a8af606e501edf70e614ebfd not found: ID does not exist"
Oct 01 14:32:33 crc kubenswrapper[4749]: I1001 14:32:33.221352 4749 scope.go:117] "RemoveContainer" containerID="e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304"
Oct 01 14:32:33 crc kubenswrapper[4749]: E1001 14:32:33.221714 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304\": container with ID starting with e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304 not found: ID does not exist" containerID="e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304"
Oct 01 14:32:33 crc kubenswrapper[4749]: I1001 14:32:33.221737 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304"} err="failed to get container status \"e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304\": rpc error: code = NotFound desc = could not find container \"e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304\": container with ID starting with e9d6c711d8744581c14f9ed27ee1a1b44e74962fa53b551cc53bb9669a00c304 not found: ID does not exist"
Oct 01 14:32:33 crc kubenswrapper[4749]: I1001 14:32:33.221752 4749 scope.go:117] "RemoveContainer" containerID="e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad"
Oct 01 14:32:33 crc kubenswrapper[4749]: E1001 14:32:33.222095 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad\": container with ID starting with e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad not found: ID does not exist" containerID="e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad"
Oct 01 14:32:33 crc kubenswrapper[4749]: I1001 14:32:33.222165 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad"} err="failed to get container status \"e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad\": rpc error: code = NotFound desc = could not find container \"e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad\": container with ID starting with e1f5f39420afc366a4a4b9a26bb3b4cf7acf3583773c348df478bb55f3df13ad not found: ID does not exist"
Oct 01 14:32:33 crc kubenswrapper[4749]: I1001 14:32:33.243796 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5927c768-10f5-4175-b09f-486084973982" path="/var/lib/kubelet/pods/5927c768-10f5-4175-b09f-486084973982/volumes"
Oct 01 14:32:34 crc kubenswrapper[4749]: I1001 14:32:34.230169 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"
Oct 01 14:32:34 crc kubenswrapper[4749]: E1001 14:32:34.230889 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:32:47 crc kubenswrapper[4749]: I1001 14:32:47.229773 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"
Oct 01 14:32:47 crc kubenswrapper[4749]: E1001 14:32:47.230483 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:33:00 crc kubenswrapper[4749]: I1001 14:33:00.230563 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"
Oct 01 14:33:00 crc kubenswrapper[4749]: E1001 14:33:00.231203 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.047528 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jrmkr"]
Oct 01 14:33:12 crc kubenswrapper[4749]: E1001 14:33:12.048432 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5927c768-10f5-4175-b09f-486084973982" containerName="extract-utilities"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.048450 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5927c768-10f5-4175-b09f-486084973982" containerName="extract-utilities"
Oct 01 14:33:12 crc kubenswrapper[4749]: E1001 14:33:12.048469 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5927c768-10f5-4175-b09f-486084973982" containerName="registry-server"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.048477 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5927c768-10f5-4175-b09f-486084973982" containerName="registry-server"
Oct 01 14:33:12 crc kubenswrapper[4749]: E1001 14:33:12.048512 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5927c768-10f5-4175-b09f-486084973982" containerName="extract-content"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.048518 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5927c768-10f5-4175-b09f-486084973982" containerName="extract-content"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.048724 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5927c768-10f5-4175-b09f-486084973982" containerName="registry-server"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.050338 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jrmkr"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.064007 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jrmkr"]
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.189793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-utilities\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.189928 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qjq8\" (UniqueName: \"kubernetes.io/projected/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-kube-api-access-9qjq8\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.189990 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-catalog-content\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.231406 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42"
Oct 01 14:33:12 crc kubenswrapper[4749]: E1001 14:33:12.231610 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.292469 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-utilities\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.292521 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qjq8\" (UniqueName: \"kubernetes.io/projected/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-kube-api-access-9qjq8\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.292538 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-catalog-content\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr"
Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.293369 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\"
(UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-catalog-content\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.293440 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-utilities\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.311354 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qjq8\" (UniqueName: \"kubernetes.io/projected/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-kube-api-access-9qjq8\") pod \"community-operators-jrmkr\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.506424 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:12 crc kubenswrapper[4749]: I1001 14:33:12.977002 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jrmkr"] Oct 01 14:33:12 crc kubenswrapper[4749]: W1001 14:33:12.981682 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1224aaad_927b_4ead_91b4_9fa65f7ea6c5.slice/crio-971a8edfce91bdf0b5ec3580b483befa0f144b515d7896ad87a7d2a88362c93e WatchSource:0}: Error finding container 971a8edfce91bdf0b5ec3580b483befa0f144b515d7896ad87a7d2a88362c93e: Status 404 returned error can't find the container with id 971a8edfce91bdf0b5ec3580b483befa0f144b515d7896ad87a7d2a88362c93e Oct 01 14:33:13 crc kubenswrapper[4749]: I1001 14:33:13.295527 4749 generic.go:334] "Generic (PLEG): container finished" podID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerID="a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749" exitCode=0 Oct 01 14:33:13 crc kubenswrapper[4749]: I1001 14:33:13.295578 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrmkr" event={"ID":"1224aaad-927b-4ead-91b4-9fa65f7ea6c5","Type":"ContainerDied","Data":"a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749"} Oct 01 14:33:13 crc kubenswrapper[4749]: I1001 14:33:13.295811 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrmkr" event={"ID":"1224aaad-927b-4ead-91b4-9fa65f7ea6c5","Type":"ContainerStarted","Data":"971a8edfce91bdf0b5ec3580b483befa0f144b515d7896ad87a7d2a88362c93e"} Oct 01 14:33:14 crc kubenswrapper[4749]: I1001 14:33:14.307715 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrmkr" 
event={"ID":"1224aaad-927b-4ead-91b4-9fa65f7ea6c5","Type":"ContainerStarted","Data":"cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0"} Oct 01 14:33:15 crc kubenswrapper[4749]: I1001 14:33:15.316385 4749 generic.go:334] "Generic (PLEG): container finished" podID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerID="cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0" exitCode=0 Oct 01 14:33:15 crc kubenswrapper[4749]: I1001 14:33:15.316433 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrmkr" event={"ID":"1224aaad-927b-4ead-91b4-9fa65f7ea6c5","Type":"ContainerDied","Data":"cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0"} Oct 01 14:33:16 crc kubenswrapper[4749]: I1001 14:33:16.326138 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrmkr" event={"ID":"1224aaad-927b-4ead-91b4-9fa65f7ea6c5","Type":"ContainerStarted","Data":"4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693"} Oct 01 14:33:16 crc kubenswrapper[4749]: I1001 14:33:16.347384 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jrmkr" podStartSLOduration=1.739172551 podStartE2EDuration="4.347364128s" podCreationTimestamp="2025-10-01 14:33:12 +0000 UTC" firstStartedPulling="2025-10-01 14:33:13.297473863 +0000 UTC m=+5253.351458762" lastFinishedPulling="2025-10-01 14:33:15.90566542 +0000 UTC m=+5255.959650339" observedRunningTime="2025-10-01 14:33:16.343066624 +0000 UTC m=+5256.397051523" watchObservedRunningTime="2025-10-01 14:33:16.347364128 +0000 UTC m=+5256.401349027" Oct 01 14:33:22 crc kubenswrapper[4749]: I1001 14:33:22.506761 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:22 crc kubenswrapper[4749]: I1001 14:33:22.507373 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:22 crc kubenswrapper[4749]: I1001 14:33:22.551175 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:23 crc kubenswrapper[4749]: I1001 14:33:23.481014 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:23 crc kubenswrapper[4749]: I1001 14:33:23.543758 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jrmkr"] Oct 01 14:33:25 crc kubenswrapper[4749]: I1001 14:33:25.438207 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jrmkr" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerName="registry-server" containerID="cri-o://4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693" gracePeriod=2 Oct 01 14:33:25 crc kubenswrapper[4749]: I1001 14:33:25.941296 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.101064 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qjq8\" (UniqueName: \"kubernetes.io/projected/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-kube-api-access-9qjq8\") pod \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.101195 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-catalog-content\") pod \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.101338 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-utilities\") pod \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\" (UID: \"1224aaad-927b-4ead-91b4-9fa65f7ea6c5\") " Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.102198 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-utilities" (OuterVolumeSpecName: "utilities") pod "1224aaad-927b-4ead-91b4-9fa65f7ea6c5" (UID: "1224aaad-927b-4ead-91b4-9fa65f7ea6c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.112477 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-kube-api-access-9qjq8" (OuterVolumeSpecName: "kube-api-access-9qjq8") pod "1224aaad-927b-4ead-91b4-9fa65f7ea6c5" (UID: "1224aaad-927b-4ead-91b4-9fa65f7ea6c5"). InnerVolumeSpecName "kube-api-access-9qjq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.150712 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1224aaad-927b-4ead-91b4-9fa65f7ea6c5" (UID: "1224aaad-927b-4ead-91b4-9fa65f7ea6c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.203238 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.203271 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.203286 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qjq8\" (UniqueName: \"kubernetes.io/projected/1224aaad-927b-4ead-91b4-9fa65f7ea6c5-kube-api-access-9qjq8\") on node \"crc\" DevicePath \"\"" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.449880 4749 generic.go:334] "Generic (PLEG): container finished" podID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerID="4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693" exitCode=0 Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.449920 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jrmkr" event={"ID":"1224aaad-927b-4ead-91b4-9fa65f7ea6c5","Type":"ContainerDied","Data":"4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693"} Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.449952 4749 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-jrmkr" event={"ID":"1224aaad-927b-4ead-91b4-9fa65f7ea6c5","Type":"ContainerDied","Data":"971a8edfce91bdf0b5ec3580b483befa0f144b515d7896ad87a7d2a88362c93e"} Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.449968 4749 scope.go:117] "RemoveContainer" containerID="4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.450004 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jrmkr" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.478322 4749 scope.go:117] "RemoveContainer" containerID="cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.488678 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jrmkr"] Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.502725 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jrmkr"] Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.508479 4749 scope.go:117] "RemoveContainer" containerID="a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.556890 4749 scope.go:117] "RemoveContainer" containerID="4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693" Oct 01 14:33:26 crc kubenswrapper[4749]: E1001 14:33:26.557382 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693\": container with ID starting with 4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693 not found: ID does not exist" containerID="4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 
14:33:26.557411 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693"} err="failed to get container status \"4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693\": rpc error: code = NotFound desc = could not find container \"4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693\": container with ID starting with 4e92f0bd09dfe76c00aea96b733a99ed0010b0ab2b24c2402c68003cb6276693 not found: ID does not exist" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.557450 4749 scope.go:117] "RemoveContainer" containerID="cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0" Oct 01 14:33:26 crc kubenswrapper[4749]: E1001 14:33:26.557747 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0\": container with ID starting with cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0 not found: ID does not exist" containerID="cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.557797 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0"} err="failed to get container status \"cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0\": rpc error: code = NotFound desc = could not find container \"cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0\": container with ID starting with cfa9d3f2d76b187bc7d695eab59039c026838d8ae540d452610eb62d38ff7cd0 not found: ID does not exist" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.557825 4749 scope.go:117] "RemoveContainer" containerID="a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749" Oct 01 14:33:26 crc 
kubenswrapper[4749]: E1001 14:33:26.558113 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749\": container with ID starting with a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749 not found: ID does not exist" containerID="a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749" Oct 01 14:33:26 crc kubenswrapper[4749]: I1001 14:33:26.558140 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749"} err="failed to get container status \"a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749\": rpc error: code = NotFound desc = could not find container \"a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749\": container with ID starting with a99024b297a4f5f1282efb20d15af850ecb677b90824ede0d4f8071a77d91749 not found: ID does not exist" Oct 01 14:33:27 crc kubenswrapper[4749]: I1001 14:33:27.230764 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:33:27 crc kubenswrapper[4749]: E1001 14:33:27.231567 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:33:27 crc kubenswrapper[4749]: I1001 14:33:27.249903 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" path="/var/lib/kubelet/pods/1224aaad-927b-4ead-91b4-9fa65f7ea6c5/volumes" Oct 01 14:33:39 crc 
kubenswrapper[4749]: I1001 14:33:39.230429 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:33:39 crc kubenswrapper[4749]: I1001 14:33:39.625740 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"a71f1718319b85748eba0f8b6e750caf7e6a60e5c280b5c4b47fe5205b47f308"} Oct 01 14:36:02 crc kubenswrapper[4749]: I1001 14:36:02.106349 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:36:02 crc kubenswrapper[4749]: I1001 14:36:02.107180 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.440263 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tjl9p"] Oct 01 14:36:30 crc kubenswrapper[4749]: E1001 14:36:30.441343 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerName="extract-utilities" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.441360 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerName="extract-utilities" Oct 01 14:36:30 crc kubenswrapper[4749]: E1001 14:36:30.441370 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" 
containerName="registry-server" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.441377 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerName="registry-server" Oct 01 14:36:30 crc kubenswrapper[4749]: E1001 14:36:30.441401 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerName="extract-content" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.441409 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerName="extract-content" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.441640 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="1224aaad-927b-4ead-91b4-9fa65f7ea6c5" containerName="registry-server" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.443023 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.470666 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tjl9p"] Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.573648 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-catalog-content\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.573716 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8k92\" (UniqueName: \"kubernetes.io/projected/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-kube-api-access-r8k92\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " 
pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.573876 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-utilities\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.676928 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-utilities\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.677076 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-catalog-content\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.677110 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8k92\" (UniqueName: \"kubernetes.io/projected/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-kube-api-access-r8k92\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.677680 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-utilities\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " 
pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.677800 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-catalog-content\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.706275 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8k92\" (UniqueName: \"kubernetes.io/projected/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-kube-api-access-r8k92\") pod \"certified-operators-tjl9p\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:30 crc kubenswrapper[4749]: I1001 14:36:30.815437 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:31 crc kubenswrapper[4749]: I1001 14:36:31.361086 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tjl9p"] Oct 01 14:36:32 crc kubenswrapper[4749]: I1001 14:36:32.106948 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:36:32 crc kubenswrapper[4749]: I1001 14:36:32.107022 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:36:32 crc kubenswrapper[4749]: 
I1001 14:36:32.543966 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerID="95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8" exitCode=0 Oct 01 14:36:32 crc kubenswrapper[4749]: I1001 14:36:32.544052 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjl9p" event={"ID":"7b03866a-cbdd-4a6a-8b64-24a8d37ada49","Type":"ContainerDied","Data":"95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8"} Oct 01 14:36:32 crc kubenswrapper[4749]: I1001 14:36:32.544102 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjl9p" event={"ID":"7b03866a-cbdd-4a6a-8b64-24a8d37ada49","Type":"ContainerStarted","Data":"50b808d35fc6210d19c154092d0b2377fe0239c8458a861a89d90f47180b3853"} Oct 01 14:36:33 crc kubenswrapper[4749]: I1001 14:36:33.561361 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjl9p" event={"ID":"7b03866a-cbdd-4a6a-8b64-24a8d37ada49","Type":"ContainerStarted","Data":"ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c"} Oct 01 14:36:35 crc kubenswrapper[4749]: I1001 14:36:35.582768 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerID="ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c" exitCode=0 Oct 01 14:36:35 crc kubenswrapper[4749]: I1001 14:36:35.582873 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjl9p" event={"ID":"7b03866a-cbdd-4a6a-8b64-24a8d37ada49","Type":"ContainerDied","Data":"ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c"} Oct 01 14:36:36 crc kubenswrapper[4749]: I1001 14:36:36.595504 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjl9p" 
event={"ID":"7b03866a-cbdd-4a6a-8b64-24a8d37ada49","Type":"ContainerStarted","Data":"9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1"} Oct 01 14:36:36 crc kubenswrapper[4749]: I1001 14:36:36.627746 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tjl9p" podStartSLOduration=3.180822104 podStartE2EDuration="6.627729467s" podCreationTimestamp="2025-10-01 14:36:30 +0000 UTC" firstStartedPulling="2025-10-01 14:36:32.546821486 +0000 UTC m=+5452.600806425" lastFinishedPulling="2025-10-01 14:36:35.993728889 +0000 UTC m=+5456.047713788" observedRunningTime="2025-10-01 14:36:36.619918512 +0000 UTC m=+5456.673903411" watchObservedRunningTime="2025-10-01 14:36:36.627729467 +0000 UTC m=+5456.681714366" Oct 01 14:36:40 crc kubenswrapper[4749]: I1001 14:36:40.816429 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:40 crc kubenswrapper[4749]: I1001 14:36:40.817205 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:40 crc kubenswrapper[4749]: I1001 14:36:40.895747 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:41 crc kubenswrapper[4749]: I1001 14:36:41.746207 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:43 crc kubenswrapper[4749]: I1001 14:36:43.627786 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tjl9p"] Oct 01 14:36:43 crc kubenswrapper[4749]: I1001 14:36:43.696682 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tjl9p" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerName="registry-server" 
containerID="cri-o://9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1" gracePeriod=2 Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.199869 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.360842 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-catalog-content\") pod \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.363182 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8k92\" (UniqueName: \"kubernetes.io/projected/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-kube-api-access-r8k92\") pod \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.364662 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-utilities\") pod \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\" (UID: \"7b03866a-cbdd-4a6a-8b64-24a8d37ada49\") " Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.366147 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-utilities" (OuterVolumeSpecName: "utilities") pod "7b03866a-cbdd-4a6a-8b64-24a8d37ada49" (UID: "7b03866a-cbdd-4a6a-8b64-24a8d37ada49"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.379814 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-kube-api-access-r8k92" (OuterVolumeSpecName: "kube-api-access-r8k92") pod "7b03866a-cbdd-4a6a-8b64-24a8d37ada49" (UID: "7b03866a-cbdd-4a6a-8b64-24a8d37ada49"). InnerVolumeSpecName "kube-api-access-r8k92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.423418 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b03866a-cbdd-4a6a-8b64-24a8d37ada49" (UID: "7b03866a-cbdd-4a6a-8b64-24a8d37ada49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.468132 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.468542 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.468553 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8k92\" (UniqueName: \"kubernetes.io/projected/7b03866a-cbdd-4a6a-8b64-24a8d37ada49-kube-api-access-r8k92\") on node \"crc\" DevicePath \"\"" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.710537 4749 generic.go:334] "Generic (PLEG): container finished" podID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" 
containerID="9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1" exitCode=0 Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.710594 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjl9p" event={"ID":"7b03866a-cbdd-4a6a-8b64-24a8d37ada49","Type":"ContainerDied","Data":"9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1"} Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.710636 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tjl9p" event={"ID":"7b03866a-cbdd-4a6a-8b64-24a8d37ada49","Type":"ContainerDied","Data":"50b808d35fc6210d19c154092d0b2377fe0239c8458a861a89d90f47180b3853"} Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.710656 4749 scope.go:117] "RemoveContainer" containerID="9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.710647 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tjl9p" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.740563 4749 scope.go:117] "RemoveContainer" containerID="ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.760030 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tjl9p"] Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.769317 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tjl9p"] Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.774392 4749 scope.go:117] "RemoveContainer" containerID="95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.815443 4749 scope.go:117] "RemoveContainer" containerID="9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1" Oct 01 14:36:44 crc kubenswrapper[4749]: E1001 14:36:44.816744 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1\": container with ID starting with 9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1 not found: ID does not exist" containerID="9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.816786 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1"} err="failed to get container status \"9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1\": rpc error: code = NotFound desc = could not find container \"9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1\": container with ID starting with 9ce1ab9f577a91c15db841d5e30a72b0b3a10e512a45392ce243e7547e93d0b1 not 
found: ID does not exist" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.816810 4749 scope.go:117] "RemoveContainer" containerID="ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c" Oct 01 14:36:44 crc kubenswrapper[4749]: E1001 14:36:44.817107 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c\": container with ID starting with ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c not found: ID does not exist" containerID="ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.817134 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c"} err="failed to get container status \"ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c\": rpc error: code = NotFound desc = could not find container \"ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c\": container with ID starting with ef7179cbe06e6e70c75d7cc263d157a83eb51610b33e455a303de06c267bf31c not found: ID does not exist" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.817151 4749 scope.go:117] "RemoveContainer" containerID="95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8" Oct 01 14:36:44 crc kubenswrapper[4749]: E1001 14:36:44.817538 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8\": container with ID starting with 95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8 not found: ID does not exist" containerID="95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8" Oct 01 14:36:44 crc kubenswrapper[4749]: I1001 14:36:44.817569 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8"} err="failed to get container status \"95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8\": rpc error: code = NotFound desc = could not find container \"95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8\": container with ID starting with 95bffef46323a140736a1627a4d987f8af56ad8b9be492307c41b9514da228f8 not found: ID does not exist" Oct 01 14:36:45 crc kubenswrapper[4749]: I1001 14:36:45.243648 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" path="/var/lib/kubelet/pods/7b03866a-cbdd-4a6a-8b64-24a8d37ada49/volumes" Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.106994 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.107856 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.107925 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.108949 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"a71f1718319b85748eba0f8b6e750caf7e6a60e5c280b5c4b47fe5205b47f308"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.109029 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://a71f1718319b85748eba0f8b6e750caf7e6a60e5c280b5c4b47fe5205b47f308" gracePeriod=600 Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.911816 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="a71f1718319b85748eba0f8b6e750caf7e6a60e5c280b5c4b47fe5205b47f308" exitCode=0 Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.912109 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"a71f1718319b85748eba0f8b6e750caf7e6a60e5c280b5c4b47fe5205b47f308"} Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.912540 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5"} Oct 01 14:37:02 crc kubenswrapper[4749]: I1001 14:37:02.912571 4749 scope.go:117] "RemoveContainer" containerID="0cc3ca23f95898f390808db4459bc951633293fe49c902b3d5282d41edab5a42" Oct 01 14:39:02 crc kubenswrapper[4749]: I1001 14:39:02.106175 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:39:02 crc kubenswrapper[4749]: I1001 14:39:02.106859 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:39:32 crc kubenswrapper[4749]: I1001 14:39:32.107087 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:39:32 crc kubenswrapper[4749]: I1001 14:39:32.107910 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:40:02 crc kubenswrapper[4749]: I1001 14:40:02.106977 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:40:02 crc kubenswrapper[4749]: I1001 14:40:02.107558 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Oct 01 14:40:02 crc kubenswrapper[4749]: I1001 14:40:02.107643 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 14:40:02 crc kubenswrapper[4749]: I1001 14:40:02.108506 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:40:02 crc kubenswrapper[4749]: I1001 14:40:02.108572 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" gracePeriod=600 Oct 01 14:40:02 crc kubenswrapper[4749]: E1001 14:40:02.230132 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:40:03 crc kubenswrapper[4749]: I1001 14:40:03.100696 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" exitCode=0 Oct 01 14:40:03 crc kubenswrapper[4749]: I1001 14:40:03.100771 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" 
event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5"} Oct 01 14:40:03 crc kubenswrapper[4749]: I1001 14:40:03.100880 4749 scope.go:117] "RemoveContainer" containerID="a71f1718319b85748eba0f8b6e750caf7e6a60e5c280b5c4b47fe5205b47f308" Oct 01 14:40:03 crc kubenswrapper[4749]: I1001 14:40:03.101910 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:40:03 crc kubenswrapper[4749]: E1001 14:40:03.102389 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:40:07 crc kubenswrapper[4749]: I1001 14:40:07.154768 4749 generic.go:334] "Generic (PLEG): container finished" podID="fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" containerID="848c24291842e13f039a155fa65ab0508347882de0b767542099dbcc3c422de4" exitCode=0 Oct 01 14:40:07 crc kubenswrapper[4749]: I1001 14:40:07.154917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d","Type":"ContainerDied","Data":"848c24291842e13f039a155fa65ab0508347882de0b767542099dbcc3c422de4"} Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.605206 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.772054 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-temporary\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.772141 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.772416 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-config-data\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.772510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ssh-key\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.772543 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config-secret\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.772951 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ca-certs\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.772988 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-workdir\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.773000 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.773090 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg9pn\" (UniqueName: \"kubernetes.io/projected/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-kube-api-access-mg9pn\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.773651 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config\") pod \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\" (UID: \"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d\") " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.774017 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-config-data" (OuterVolumeSpecName: "config-data") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.774831 4749 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.774920 4749 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.779641 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.779895 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.780120 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-kube-api-access-mg9pn" (OuterVolumeSpecName: "kube-api-access-mg9pn") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "kube-api-access-mg9pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.817983 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.826641 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.831339 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.852987 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" (UID: "fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.876858 4749 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.876894 4749 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.876913 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.876925 4749 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.876936 4749 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.876948 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg9pn\" (UniqueName: \"kubernetes.io/projected/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-kube-api-access-mg9pn\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.876960 4749 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 
14:40:08.902096 4749 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 01 14:40:08 crc kubenswrapper[4749]: I1001 14:40:08.978892 4749 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 01 14:40:09 crc kubenswrapper[4749]: I1001 14:40:09.193569 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d","Type":"ContainerDied","Data":"15c30f8d386f30b556bc7e303126366e4eb5b0ddf63bbfbfe9c12678efe242fa"} Oct 01 14:40:09 crc kubenswrapper[4749]: I1001 14:40:09.194056 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15c30f8d386f30b556bc7e303126366e4eb5b0ddf63bbfbfe9c12678efe242fa" Oct 01 14:40:09 crc kubenswrapper[4749]: I1001 14:40:09.193599 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 14:40:16 crc kubenswrapper[4749]: I1001 14:40:16.229801 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:40:16 crc kubenswrapper[4749]: E1001 14:40:16.230845 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.736990 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:40:18 crc kubenswrapper[4749]: E1001 14:40:18.737967 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerName="extract-utilities" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.737991 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerName="extract-utilities" Oct 01 14:40:18 crc kubenswrapper[4749]: E1001 14:40:18.738030 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.738044 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:40:18 crc kubenswrapper[4749]: E1001 14:40:18.738066 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerName="extract-content" Oct 01 14:40:18 crc 
kubenswrapper[4749]: I1001 14:40:18.738078 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerName="extract-content" Oct 01 14:40:18 crc kubenswrapper[4749]: E1001 14:40:18.738098 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerName="registry-server" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.738109 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerName="registry-server" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.738491 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d" containerName="tempest-tests-tempest-tests-runner" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.738521 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b03866a-cbdd-4a6a-8b64-24a8d37ada49" containerName="registry-server" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.739686 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.745852 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sfnbt" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.753933 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.895709 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1390b811-1714-4e90-9491-e38ba4b0a530\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.895854 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtfm\" (UniqueName: \"kubernetes.io/projected/1390b811-1714-4e90-9491-e38ba4b0a530-kube-api-access-8xtfm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1390b811-1714-4e90-9491-e38ba4b0a530\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.998933 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1390b811-1714-4e90-9491-e38ba4b0a530\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.999330 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xtfm\" (UniqueName: 
\"kubernetes.io/projected/1390b811-1714-4e90-9491-e38ba4b0a530-kube-api-access-8xtfm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1390b811-1714-4e90-9491-e38ba4b0a530\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:18 crc kubenswrapper[4749]: I1001 14:40:18.999618 4749 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1390b811-1714-4e90-9491-e38ba4b0a530\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:19 crc kubenswrapper[4749]: I1001 14:40:19.026568 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtfm\" (UniqueName: \"kubernetes.io/projected/1390b811-1714-4e90-9491-e38ba4b0a530-kube-api-access-8xtfm\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1390b811-1714-4e90-9491-e38ba4b0a530\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:19 crc kubenswrapper[4749]: I1001 14:40:19.050665 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1390b811-1714-4e90-9491-e38ba4b0a530\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:19 crc kubenswrapper[4749]: I1001 14:40:19.081591 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 14:40:19 crc kubenswrapper[4749]: I1001 14:40:19.633805 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 14:40:19 crc kubenswrapper[4749]: I1001 14:40:19.639303 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:40:20 crc kubenswrapper[4749]: I1001 14:40:20.318728 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1390b811-1714-4e90-9491-e38ba4b0a530","Type":"ContainerStarted","Data":"fe764c69847f0285284712d3e13a6e09b6bbd18e2680f29449e4e7e768af36ef"} Oct 01 14:40:21 crc kubenswrapper[4749]: I1001 14:40:21.332369 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1390b811-1714-4e90-9491-e38ba4b0a530","Type":"ContainerStarted","Data":"92872c269f5a2045de9c5b96c68600ed931b74e07f70902f2fde0af6dff78ea2"} Oct 01 14:40:21 crc kubenswrapper[4749]: I1001 14:40:21.356191 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.394085356 podStartE2EDuration="3.356165702s" podCreationTimestamp="2025-10-01 14:40:18 +0000 UTC" firstStartedPulling="2025-10-01 14:40:19.639016243 +0000 UTC m=+5679.693001152" lastFinishedPulling="2025-10-01 14:40:20.601096569 +0000 UTC m=+5680.655081498" observedRunningTime="2025-10-01 14:40:21.346356629 +0000 UTC m=+5681.400341598" watchObservedRunningTime="2025-10-01 14:40:21.356165702 +0000 UTC m=+5681.410150621" Oct 01 14:40:29 crc kubenswrapper[4749]: I1001 14:40:29.230187 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:40:29 crc kubenswrapper[4749]: E1001 
14:40:29.231297 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:40:40 crc kubenswrapper[4749]: I1001 14:40:40.931803 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lqrjw/must-gather-xdlwh"] Oct 01 14:40:40 crc kubenswrapper[4749]: I1001 14:40:40.934103 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:40:40 crc kubenswrapper[4749]: I1001 14:40:40.936740 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lqrjw"/"openshift-service-ca.crt" Oct 01 14:40:40 crc kubenswrapper[4749]: I1001 14:40:40.938207 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lqrjw"/"default-dockercfg-n4z6s" Oct 01 14:40:40 crc kubenswrapper[4749]: I1001 14:40:40.938821 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lqrjw"/"kube-root-ca.crt" Oct 01 14:40:40 crc kubenswrapper[4749]: I1001 14:40:40.945746 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqrjw/must-gather-xdlwh"] Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 14:40:41.005640 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c3367109-f7d7-4296-9717-853f4700b93a-must-gather-output\") pod \"must-gather-xdlwh\" (UID: \"c3367109-f7d7-4296-9717-853f4700b93a\") " pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 
14:40:41.005793 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwv6g\" (UniqueName: \"kubernetes.io/projected/c3367109-f7d7-4296-9717-853f4700b93a-kube-api-access-cwv6g\") pod \"must-gather-xdlwh\" (UID: \"c3367109-f7d7-4296-9717-853f4700b93a\") " pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 14:40:41.107607 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwv6g\" (UniqueName: \"kubernetes.io/projected/c3367109-f7d7-4296-9717-853f4700b93a-kube-api-access-cwv6g\") pod \"must-gather-xdlwh\" (UID: \"c3367109-f7d7-4296-9717-853f4700b93a\") " pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 14:40:41.107767 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c3367109-f7d7-4296-9717-853f4700b93a-must-gather-output\") pod \"must-gather-xdlwh\" (UID: \"c3367109-f7d7-4296-9717-853f4700b93a\") " pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 14:40:41.108315 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c3367109-f7d7-4296-9717-853f4700b93a-must-gather-output\") pod \"must-gather-xdlwh\" (UID: \"c3367109-f7d7-4296-9717-853f4700b93a\") " pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 14:40:41.132548 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwv6g\" (UniqueName: \"kubernetes.io/projected/c3367109-f7d7-4296-9717-853f4700b93a-kube-api-access-cwv6g\") pod \"must-gather-xdlwh\" (UID: \"c3367109-f7d7-4296-9717-853f4700b93a\") " pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 
14:40:41.259189 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lqrjw"/"default-dockercfg-n4z6s" Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 14:40:41.268271 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:40:41 crc kubenswrapper[4749]: I1001 14:40:41.737683 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqrjw/must-gather-xdlwh"] Oct 01 14:40:42 crc kubenswrapper[4749]: I1001 14:40:42.592710 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" event={"ID":"c3367109-f7d7-4296-9717-853f4700b93a","Type":"ContainerStarted","Data":"3e3bcec0423d7ad36843ef6cd72157ae49702356b2f110ca18df948dfe543c55"} Oct 01 14:40:44 crc kubenswrapper[4749]: I1001 14:40:44.229667 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:40:44 crc kubenswrapper[4749]: E1001 14:40:44.230242 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:40:48 crc kubenswrapper[4749]: I1001 14:40:48.669277 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" event={"ID":"c3367109-f7d7-4296-9717-853f4700b93a","Type":"ContainerStarted","Data":"8fcb2d37fdc00caecad3ba3efbd05b2056b53f09aaedab1758199d95a68b69ac"} Oct 01 14:40:49 crc kubenswrapper[4749]: I1001 14:40:49.682305 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" 
event={"ID":"c3367109-f7d7-4296-9717-853f4700b93a","Type":"ContainerStarted","Data":"7480aed2048c05b8a28a51b2ee3358c7a52ed2d7b02491e36218a3cfdd038cc0"} Oct 01 14:40:49 crc kubenswrapper[4749]: I1001 14:40:49.702686 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" podStartSLOduration=3.200912866 podStartE2EDuration="9.702661348s" podCreationTimestamp="2025-10-01 14:40:40 +0000 UTC" firstStartedPulling="2025-10-01 14:40:41.742419733 +0000 UTC m=+5701.796404632" lastFinishedPulling="2025-10-01 14:40:48.244168185 +0000 UTC m=+5708.298153114" observedRunningTime="2025-10-01 14:40:49.697907121 +0000 UTC m=+5709.751892050" watchObservedRunningTime="2025-10-01 14:40:49.702661348 +0000 UTC m=+5709.756646277" Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.099632 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-z2nf9"] Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.101562 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.260360 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c080998-4601-4515-9938-c83aa425429d-host\") pod \"crc-debug-z2nf9\" (UID: \"0c080998-4601-4515-9938-c83aa425429d\") " pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.260435 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqcwx\" (UniqueName: \"kubernetes.io/projected/0c080998-4601-4515-9938-c83aa425429d-kube-api-access-lqcwx\") pod \"crc-debug-z2nf9\" (UID: \"0c080998-4601-4515-9938-c83aa425429d\") " pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.362253 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqcwx\" (UniqueName: \"kubernetes.io/projected/0c080998-4601-4515-9938-c83aa425429d-kube-api-access-lqcwx\") pod \"crc-debug-z2nf9\" (UID: \"0c080998-4601-4515-9938-c83aa425429d\") " pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.362967 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c080998-4601-4515-9938-c83aa425429d-host\") pod \"crc-debug-z2nf9\" (UID: \"0c080998-4601-4515-9938-c83aa425429d\") " pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.363048 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c080998-4601-4515-9938-c83aa425429d-host\") pod \"crc-debug-z2nf9\" (UID: \"0c080998-4601-4515-9938-c83aa425429d\") " pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:40:53 crc 
kubenswrapper[4749]: I1001 14:40:53.385188 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqcwx\" (UniqueName: \"kubernetes.io/projected/0c080998-4601-4515-9938-c83aa425429d-kube-api-access-lqcwx\") pod \"crc-debug-z2nf9\" (UID: \"0c080998-4601-4515-9938-c83aa425429d\") " pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.419413 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:40:53 crc kubenswrapper[4749]: I1001 14:40:53.720089 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" event={"ID":"0c080998-4601-4515-9938-c83aa425429d","Type":"ContainerStarted","Data":"836b4a88460d9194d45c0b6357d212f7f75e2cbbec5ec25c0068241d6b53264b"} Oct 01 14:40:55 crc kubenswrapper[4749]: E1001 14:40:55.115298 4749 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.220:60568->38.102.83.220:34693: write tcp 38.102.83.220:60568->38.102.83.220:34693: write: broken pipe Oct 01 14:40:57 crc kubenswrapper[4749]: I1001 14:40:57.229754 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:40:57 crc kubenswrapper[4749]: E1001 14:40:57.230240 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:41:04 crc kubenswrapper[4749]: I1001 14:41:04.835384 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" 
event={"ID":"0c080998-4601-4515-9938-c83aa425429d","Type":"ContainerStarted","Data":"cb3b464281fa1657283409f5cf05e8cfdc8e2623fcbd79af2a764289b7db20b3"} Oct 01 14:41:04 crc kubenswrapper[4749]: I1001 14:41:04.859438 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" podStartSLOduration=1.518920124 podStartE2EDuration="11.859414789s" podCreationTimestamp="2025-10-01 14:40:53 +0000 UTC" firstStartedPulling="2025-10-01 14:40:53.477195868 +0000 UTC m=+5713.531180767" lastFinishedPulling="2025-10-01 14:41:03.817690533 +0000 UTC m=+5723.871675432" observedRunningTime="2025-10-01 14:41:04.847402772 +0000 UTC m=+5724.901387692" watchObservedRunningTime="2025-10-01 14:41:04.859414789 +0000 UTC m=+5724.913399688" Oct 01 14:41:08 crc kubenswrapper[4749]: I1001 14:41:08.231289 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:41:08 crc kubenswrapper[4749]: E1001 14:41:08.232165 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:41:20 crc kubenswrapper[4749]: I1001 14:41:20.229654 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:41:20 crc kubenswrapper[4749]: E1001 14:41:20.230542 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:41:34 crc kubenswrapper[4749]: I1001 14:41:34.230044 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:41:34 crc kubenswrapper[4749]: E1001 14:41:34.231564 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:41:47 crc kubenswrapper[4749]: I1001 14:41:47.229819 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:41:47 crc kubenswrapper[4749]: E1001 14:41:47.230568 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:41:58 crc kubenswrapper[4749]: I1001 14:41:58.229982 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:41:58 crc kubenswrapper[4749]: E1001 14:41:58.230800 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:42:06 crc kubenswrapper[4749]: I1001 14:42:06.923710 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qqwq5"] Oct 01 14:42:06 crc kubenswrapper[4749]: I1001 14:42:06.938317 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:06 crc kubenswrapper[4749]: I1001 14:42:06.956985 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqwq5"] Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.036249 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqg69\" (UniqueName: \"kubernetes.io/projected/5ba20c94-5658-4c67-b52f-88722cd86e4c-kube-api-access-rqg69\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.036364 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-catalog-content\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.036415 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-utilities\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " 
pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.138032 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-catalog-content\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.138104 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-utilities\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.138247 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqg69\" (UniqueName: \"kubernetes.io/projected/5ba20c94-5658-4c67-b52f-88722cd86e4c-kube-api-access-rqg69\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.138734 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-catalog-content\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.138830 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-utilities\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " pod="openshift-marketplace/redhat-marketplace-qqwq5" 
Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.160554 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqg69\" (UniqueName: \"kubernetes.io/projected/5ba20c94-5658-4c67-b52f-88722cd86e4c-kube-api-access-rqg69\") pod \"redhat-marketplace-qqwq5\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.274415 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:07 crc kubenswrapper[4749]: I1001 14:42:07.779427 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqwq5"] Oct 01 14:42:08 crc kubenswrapper[4749]: I1001 14:42:08.499719 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerID="5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9" exitCode=0 Oct 01 14:42:08 crc kubenswrapper[4749]: I1001 14:42:08.499975 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqwq5" event={"ID":"5ba20c94-5658-4c67-b52f-88722cd86e4c","Type":"ContainerDied","Data":"5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9"} Oct 01 14:42:08 crc kubenswrapper[4749]: I1001 14:42:08.499999 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqwq5" event={"ID":"5ba20c94-5658-4c67-b52f-88722cd86e4c","Type":"ContainerStarted","Data":"1da3c7fabecdff1095c96d62b12f451f07f41b50cb2784a2d86373277e5b9e41"} Oct 01 14:42:10 crc kubenswrapper[4749]: I1001 14:42:10.518240 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerID="bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4" exitCode=0 Oct 01 14:42:10 crc kubenswrapper[4749]: I1001 14:42:10.518359 4749 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqwq5" event={"ID":"5ba20c94-5658-4c67-b52f-88722cd86e4c","Type":"ContainerDied","Data":"bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4"} Oct 01 14:42:11 crc kubenswrapper[4749]: I1001 14:42:11.235883 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:42:11 crc kubenswrapper[4749]: E1001 14:42:11.236386 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:42:11 crc kubenswrapper[4749]: I1001 14:42:11.529184 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqwq5" event={"ID":"5ba20c94-5658-4c67-b52f-88722cd86e4c","Type":"ContainerStarted","Data":"3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7"} Oct 01 14:42:11 crc kubenswrapper[4749]: I1001 14:42:11.548562 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qqwq5" podStartSLOduration=3.130109544 podStartE2EDuration="5.548545075s" podCreationTimestamp="2025-10-01 14:42:06 +0000 UTC" firstStartedPulling="2025-10-01 14:42:08.508155325 +0000 UTC m=+5788.562140224" lastFinishedPulling="2025-10-01 14:42:10.926590856 +0000 UTC m=+5790.980575755" observedRunningTime="2025-10-01 14:42:11.546558718 +0000 UTC m=+5791.600543657" watchObservedRunningTime="2025-10-01 14:42:11.548545075 +0000 UTC m=+5791.602529984" Oct 01 14:42:12 crc kubenswrapper[4749]: I1001 14:42:12.800541 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-677dbd476-b92fx_0e51f972-9f26-4b6b-8213-9261797a1ee0/barbican-api-log/0.log" Oct 01 14:42:12 crc kubenswrapper[4749]: I1001 14:42:12.803578 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-677dbd476-b92fx_0e51f972-9f26-4b6b-8213-9261797a1ee0/barbican-api/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.013085 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5fdf7b5778-vxx8p_a27e333f-57a3-4257-9e49-e03928cfa02d/barbican-keystone-listener/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.082730 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5fdf7b5778-vxx8p_a27e333f-57a3-4257-9e49-e03928cfa02d/barbican-keystone-listener-log/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.250706 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c6dfb5585-x78z5_c8b1d3a9-044c-475f-b86f-7e099e2b1197/barbican-worker/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.335121 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c6dfb5585-x78z5_c8b1d3a9-044c-475f-b86f-7e099e2b1197/barbican-worker-log/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.479518 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w_36105d3f-3305-4cd8-9b9c-4b3d7eaec504/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.781484 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8443de13-c4a9-420c-a4ff-5aa54d222850/ceilometer-notification-agent/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.808398 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_8443de13-c4a9-420c-a4ff-5aa54d222850/proxy-httpd/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.808448 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8443de13-c4a9-420c-a4ff-5aa54d222850/ceilometer-central-agent/0.log" Oct 01 14:42:13 crc kubenswrapper[4749]: I1001 14:42:13.973870 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8443de13-c4a9-420c-a4ff-5aa54d222850/sg-core/0.log" Oct 01 14:42:14 crc kubenswrapper[4749]: I1001 14:42:14.196487 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_468beb78-1358-4a1b-ad2c-3941f3f270c6/cinder-api/0.log" Oct 01 14:42:14 crc kubenswrapper[4749]: I1001 14:42:14.241863 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_468beb78-1358-4a1b-ad2c-3941f3f270c6/cinder-api-log/0.log" Oct 01 14:42:14 crc kubenswrapper[4749]: I1001 14:42:14.391173 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_005bd9d5-4799-4763-aa6b-46a9341c36d2/cinder-scheduler/0.log" Oct 01 14:42:14 crc kubenswrapper[4749]: I1001 14:42:14.569922 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_005bd9d5-4799-4763-aa6b-46a9341c36d2/probe/0.log" Oct 01 14:42:14 crc kubenswrapper[4749]: I1001 14:42:14.671689 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx_f382a017-d5fe-45d9-ad7b-f9316dbd5834/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:14 crc kubenswrapper[4749]: I1001 14:42:14.872505 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t_45b8016b-ecf1-4187-98eb-daf846021c8c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.077109 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-q4scf_753c7ac8-1ca7-4787-af3b-87553f59bc9f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.167159 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6ff49d554c-jx4l4_1c268c09-ae8f-49b8-916f-b5ce032bfaf1/init/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.331560 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6ff49d554c-jx4l4_1c268c09-ae8f-49b8-916f-b5ce032bfaf1/init/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.459115 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6ff49d554c-jx4l4_1c268c09-ae8f-49b8-916f-b5ce032bfaf1/dnsmasq-dns/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.539262 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn_505c57d6-8e3e-469e-b7ed-c15bdff56519/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.660356 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bc78bfd1-472f-4d64-b48c-7b986bee129a/glance-httpd/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.715708 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bc78bfd1-472f-4d64-b48c-7b986bee129a/glance-log/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.876462 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_495b97d9-1d27-4e6e-a857-ee6cfdf6dffa/glance-log/0.log" Oct 01 14:42:15 crc kubenswrapper[4749]: I1001 14:42:15.892271 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_495b97d9-1d27-4e6e-a857-ee6cfdf6dffa/glance-httpd/0.log" Oct 01 14:42:16 crc kubenswrapper[4749]: I1001 14:42:16.154246 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b74b5b846-r84t7_e22321e2-ded2-4732-ac89-f9f0d4dcd199/horizon/0.log" Oct 01 14:42:16 crc kubenswrapper[4749]: I1001 14:42:16.162347 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8_c51108bd-9132-43bf-ac9b-61a8284dc289/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:16 crc kubenswrapper[4749]: I1001 14:42:16.405401 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mmvsf_9ef2bd67-60d1-4f4b-893c-f7e22430addd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:16 crc kubenswrapper[4749]: I1001 14:42:16.624655 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b74b5b846-r84t7_e22321e2-ded2-4732-ac89-f9f0d4dcd199/horizon-log/0.log" Oct 01 14:42:16 crc kubenswrapper[4749]: I1001 14:42:16.700669 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322121-ksm5l_9f1c1f6f-c5a5-499c-874f-245d4d918274/keystone-cron/0.log" Oct 01 14:42:16 crc kubenswrapper[4749]: I1001 14:42:16.934552 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4b21eff6-e2ad-4c02-9558-0346ff822f46/kube-state-metrics/0.log" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.047328 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-658b64fcb-k2w2c_5b73cc62-6695-480f-90cc-8d1f4b5993b3/keystone-api/0.log" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.137005 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bksd6_f2871c6b-b170-4396-8c0b-be0ac02c1b48/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.274600 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.274720 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.330507 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.578380 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59d7c7544c-n7mlp_680ec9d6-ccd3-4417-9919-7412600f23fb/neutron-httpd/0.log" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.631698 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.677762 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqwq5"] Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.714095 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59d7c7544c-n7mlp_680ec9d6-ccd3-4417-9919-7412600f23fb/neutron-api/0.log" Oct 01 14:42:17 crc kubenswrapper[4749]: I1001 14:42:17.787964 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528_9f01f729-fe3b-4f70-89c9-4398f80160e7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:18 crc kubenswrapper[4749]: I1001 14:42:18.550749 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c/nova-cell0-conductor-conductor/0.log" Oct 01 14:42:19 crc kubenswrapper[4749]: I1001 14:42:19.157058 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2fe9d22f-227f-4f5a-8c9c-fc50845af518/nova-cell1-conductor-conductor/0.log" Oct 01 14:42:19 crc kubenswrapper[4749]: I1001 14:42:19.533192 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_90086f57-f0d4-4a80-9606-d225410b66e2/nova-api-log/0.log" Oct 01 14:42:19 crc kubenswrapper[4749]: I1001 14:42:19.602755 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qqwq5" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerName="registry-server" containerID="cri-o://3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7" gracePeriod=2 Oct 01 14:42:19 crc kubenswrapper[4749]: I1001 14:42:19.734540 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b4293a98-bf1d-47e9-9c16-e272e6c836f7/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 14:42:19 crc kubenswrapper[4749]: I1001 14:42:19.879771 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_90086f57-f0d4-4a80-9606-d225410b66e2/nova-api-api/0.log" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.072367 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-n4plg_848e191d-2e82-41af-8368-7c9c7e7b200e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.096584 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.180778 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqg69\" (UniqueName: \"kubernetes.io/projected/5ba20c94-5658-4c67-b52f-88722cd86e4c-kube-api-access-rqg69\") pod \"5ba20c94-5658-4c67-b52f-88722cd86e4c\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.180989 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-utilities\") pod \"5ba20c94-5658-4c67-b52f-88722cd86e4c\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.181018 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-catalog-content\") pod \"5ba20c94-5658-4c67-b52f-88722cd86e4c\" (UID: \"5ba20c94-5658-4c67-b52f-88722cd86e4c\") " Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.182556 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-utilities" (OuterVolumeSpecName: "utilities") pod "5ba20c94-5658-4c67-b52f-88722cd86e4c" (UID: "5ba20c94-5658-4c67-b52f-88722cd86e4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.199178 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ba20c94-5658-4c67-b52f-88722cd86e4c" (UID: "5ba20c94-5658-4c67-b52f-88722cd86e4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.202400 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba20c94-5658-4c67-b52f-88722cd86e4c-kube-api-access-rqg69" (OuterVolumeSpecName: "kube-api-access-rqg69") pod "5ba20c94-5658-4c67-b52f-88722cd86e4c" (UID: "5ba20c94-5658-4c67-b52f-88722cd86e4c"). InnerVolumeSpecName "kube-api-access-rqg69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.236094 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2c53c696-a24d-4024-86dc-2ce22e1a2e8e/nova-metadata-log/0.log" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.284282 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.284311 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ba20c94-5658-4c67-b52f-88722cd86e4c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.284323 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqg69\" (UniqueName: \"kubernetes.io/projected/5ba20c94-5658-4c67-b52f-88722cd86e4c-kube-api-access-rqg69\") on node \"crc\" DevicePath \"\"" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.614842 4749 generic.go:334] "Generic (PLEG): container finished" podID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerID="3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7" exitCode=0 Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.614904 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqwq5" 
event={"ID":"5ba20c94-5658-4c67-b52f-88722cd86e4c","Type":"ContainerDied","Data":"3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7"} Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.615283 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qqwq5" event={"ID":"5ba20c94-5658-4c67-b52f-88722cd86e4c","Type":"ContainerDied","Data":"1da3c7fabecdff1095c96d62b12f451f07f41b50cb2784a2d86373277e5b9e41"} Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.615310 4749 scope.go:117] "RemoveContainer" containerID="3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.614972 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qqwq5" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.646037 4749 scope.go:117] "RemoveContainer" containerID="bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.665291 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqwq5"] Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.673567 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qqwq5"] Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.687238 4749 scope.go:117] "RemoveContainer" containerID="5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.723172 4749 scope.go:117] "RemoveContainer" containerID="3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7" Oct 01 14:42:20 crc kubenswrapper[4749]: E1001 14:42:20.724930 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7\": container 
with ID starting with 3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7 not found: ID does not exist" containerID="3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.724982 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7"} err="failed to get container status \"3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7\": rpc error: code = NotFound desc = could not find container \"3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7\": container with ID starting with 3445d64e7f53bf6689fc1c70f66e3d499ff8ade158dc06cdaa86a8b10bdde3a7 not found: ID does not exist" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.725013 4749 scope.go:117] "RemoveContainer" containerID="bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4" Oct 01 14:42:20 crc kubenswrapper[4749]: E1001 14:42:20.729365 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4\": container with ID starting with bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4 not found: ID does not exist" containerID="bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.729400 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4"} err="failed to get container status \"bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4\": rpc error: code = NotFound desc = could not find container \"bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4\": container with ID starting with bfe58dac59b4b245239eb6ddecf406b3eb7e838ed572c0a17ba3fda7d368f9f4 not 
found: ID does not exist" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.729424 4749 scope.go:117] "RemoveContainer" containerID="5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9" Oct 01 14:42:20 crc kubenswrapper[4749]: E1001 14:42:20.731312 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9\": container with ID starting with 5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9 not found: ID does not exist" containerID="5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.731354 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9"} err="failed to get container status \"5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9\": rpc error: code = NotFound desc = could not find container \"5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9\": container with ID starting with 5392547c2e86ec3695161945cb978733df08a03b4b667a176506aaa7f4c66bd9 not found: ID does not exist" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.802578 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1097258f-f21a-4b28-935a-d7dea1d508dd/nova-scheduler-scheduler/0.log" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.812231 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b75b73e1-aac4-41c8-9ad7-afe216cf9741/mysql-bootstrap/0.log" Oct 01 14:42:20 crc kubenswrapper[4749]: I1001 14:42:20.987546 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b75b73e1-aac4-41c8-9ad7-afe216cf9741/galera/0.log" Oct 01 14:42:21 crc kubenswrapper[4749]: I1001 14:42:21.016043 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b75b73e1-aac4-41c8-9ad7-afe216cf9741/mysql-bootstrap/0.log" Oct 01 14:42:21 crc kubenswrapper[4749]: I1001 14:42:21.243033 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" path="/var/lib/kubelet/pods/5ba20c94-5658-4c67-b52f-88722cd86e4c/volumes" Oct 01 14:42:21 crc kubenswrapper[4749]: I1001 14:42:21.256624 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_215f6c11-74c0-4e5e-a39d-8af23dd5e4af/mysql-bootstrap/0.log" Oct 01 14:42:21 crc kubenswrapper[4749]: I1001 14:42:21.497664 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_215f6c11-74c0-4e5e-a39d-8af23dd5e4af/mysql-bootstrap/0.log" Oct 01 14:42:21 crc kubenswrapper[4749]: I1001 14:42:21.508503 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_215f6c11-74c0-4e5e-a39d-8af23dd5e4af/galera/0.log" Oct 01 14:42:21 crc kubenswrapper[4749]: I1001 14:42:21.710914 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b55ebe69-1518-428b-9ceb-383de60316cc/openstackclient/0.log" Oct 01 14:42:21 crc kubenswrapper[4749]: I1001 14:42:21.880118 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jfv7g_2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca/openstack-network-exporter/0.log" Oct 01 14:42:22 crc kubenswrapper[4749]: I1001 14:42:22.134382 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-75t94_45d86d96-b332-498e-a952-c34007c2f07b/ovsdb-server-init/0.log" Oct 01 14:42:22 crc kubenswrapper[4749]: I1001 14:42:22.362814 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-75t94_45d86d96-b332-498e-a952-c34007c2f07b/ovsdb-server-init/0.log" Oct 01 14:42:22 crc kubenswrapper[4749]: I1001 14:42:22.560609 4749 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-75t94_45d86d96-b332-498e-a952-c34007c2f07b/ovsdb-server/0.log" Oct 01 14:42:22 crc kubenswrapper[4749]: I1001 14:42:22.769551 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-75t94_45d86d96-b332-498e-a952-c34007c2f07b/ovs-vswitchd/0.log" Oct 01 14:42:22 crc kubenswrapper[4749]: I1001 14:42:22.793931 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2c53c696-a24d-4024-86dc-2ce22e1a2e8e/nova-metadata-metadata/0.log" Oct 01 14:42:23 crc kubenswrapper[4749]: I1001 14:42:23.020027 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sl4xv_4035d0d3-eeec-429f-b31e-ab4649ecf92a/ovn-controller/0.log" Oct 01 14:42:23 crc kubenswrapper[4749]: I1001 14:42:23.049548 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-x8tkd_4b848b5a-f3c5-438c-a481-f06d07d4273a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:23 crc kubenswrapper[4749]: I1001 14:42:23.255609 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_78021fea-966d-45d2-8816-265437360e8f/openstack-network-exporter/0.log" Oct 01 14:42:23 crc kubenswrapper[4749]: I1001 14:42:23.313984 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_78021fea-966d-45d2-8816-265437360e8f/ovn-northd/0.log" Oct 01 14:42:23 crc kubenswrapper[4749]: I1001 14:42:23.443267 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eb007c1f-6b53-4c8a-9921-85ccd3d5dad5/openstack-network-exporter/0.log" Oct 01 14:42:23 crc kubenswrapper[4749]: I1001 14:42:23.567234 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eb007c1f-6b53-4c8a-9921-85ccd3d5dad5/ovsdbserver-nb/0.log" Oct 01 14:42:23 crc kubenswrapper[4749]: I1001 
14:42:23.775947 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_854288a3-bb59-4721-b1ac-059920cd8c30/openstack-network-exporter/0.log" Oct 01 14:42:23 crc kubenswrapper[4749]: I1001 14:42:23.786909 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_854288a3-bb59-4721-b1ac-059920cd8c30/ovsdbserver-sb/0.log" Oct 01 14:42:24 crc kubenswrapper[4749]: I1001 14:42:24.189277 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-ffbb6dc5b-8kwbn_5d400dce-67f7-4e74-b2b9-85f0302a3e43/placement-api/0.log" Oct 01 14:42:24 crc kubenswrapper[4749]: I1001 14:42:24.271092 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-ffbb6dc5b-8kwbn_5d400dce-67f7-4e74-b2b9-85f0302a3e43/placement-log/0.log" Oct 01 14:42:24 crc kubenswrapper[4749]: I1001 14:42:24.432442 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/init-config-reloader/0.log" Oct 01 14:42:24 crc kubenswrapper[4749]: I1001 14:42:24.559701 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/init-config-reloader/0.log" Oct 01 14:42:24 crc kubenswrapper[4749]: I1001 14:42:24.599447 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/config-reloader/0.log" Oct 01 14:42:24 crc kubenswrapper[4749]: I1001 14:42:24.642280 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/prometheus/0.log" Oct 01 14:42:24 crc kubenswrapper[4749]: I1001 14:42:24.781482 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/thanos-sidecar/0.log" Oct 01 14:42:24 crc 
kubenswrapper[4749]: I1001 14:42:24.858240 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7/setup-container/0.log" Oct 01 14:42:25 crc kubenswrapper[4749]: I1001 14:42:25.007463 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7/setup-container/0.log" Oct 01 14:42:25 crc kubenswrapper[4749]: I1001 14:42:25.068212 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7/rabbitmq/0.log" Oct 01 14:42:25 crc kubenswrapper[4749]: I1001 14:42:25.223245 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1d655140-d63d-4e40-8de1-875213f37d4a/setup-container/0.log" Oct 01 14:42:25 crc kubenswrapper[4749]: I1001 14:42:25.439763 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1d655140-d63d-4e40-8de1-875213f37d4a/setup-container/0.log" Oct 01 14:42:25 crc kubenswrapper[4749]: I1001 14:42:25.453949 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1d655140-d63d-4e40-8de1-875213f37d4a/rabbitmq/0.log" Oct 01 14:42:25 crc kubenswrapper[4749]: I1001 14:42:25.655754 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b5aa915-bf5a-4046-834c-6051ed420f42/setup-container/0.log" Oct 01 14:42:25 crc kubenswrapper[4749]: I1001 14:42:25.822720 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b5aa915-bf5a-4046-834c-6051ed420f42/setup-container/0.log" Oct 01 14:42:25 crc kubenswrapper[4749]: I1001 14:42:25.916622 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b5aa915-bf5a-4046-834c-6051ed420f42/rabbitmq/0.log" Oct 01 14:42:26 crc kubenswrapper[4749]: 
I1001 14:42:26.073845 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt_9b4bd5b0-38c2-416c-aba3-9a0522807502/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:26 crc kubenswrapper[4749]: I1001 14:42:26.231119 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:42:26 crc kubenswrapper[4749]: E1001 14:42:26.231399 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:42:26 crc kubenswrapper[4749]: I1001 14:42:26.353667 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zl4wx_ae065bff-2fba-4e8b-a734-75cd8b9d1a26/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:26 crc kubenswrapper[4749]: I1001 14:42:26.525745 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n659n_ef750054-fd5c-408e-bd33-90e1a43d8a86/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:26 crc kubenswrapper[4749]: I1001 14:42:26.782073 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s59r6_e51631e2-8bb9-4f43-958a-a3475d800d61/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:26 crc kubenswrapper[4749]: I1001 14:42:26.807711 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r4gm5_99ed982c-1039-47b4-b8f8-fcc9d06e636d/ssh-known-hosts-edpm-deployment/0.log" Oct 01 
14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.016753 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85b88cb8b7-2gzx9_6139ffc4-c70f-45d5-aa79-6fc7b79f2034/proxy-server/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.265453 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-h6gm9_c0804583-6f4e-48e5-99f5-eaee2844191d/swift-ring-rebalance/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.322382 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85b88cb8b7-2gzx9_6139ffc4-c70f-45d5-aa79-6fc7b79f2034/proxy-httpd/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.482798 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/account-auditor/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.524374 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/account-reaper/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.743443 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/account-server/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.744676 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/account-replicator/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.750525 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/container-auditor/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.971260 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/container-server/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 
14:42:27.971776 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/container-updater/0.log" Oct 01 14:42:27 crc kubenswrapper[4749]: I1001 14:42:27.984998 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/container-replicator/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.184798 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-auditor/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.217732 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-expirer/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.261713 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-replicator/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.402558 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-updater/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.418015 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-server/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.482717 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/rsync/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.667809 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/swift-recon-cron/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.726686 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6_a74a77b0-6409-400a-a75c-115e2b2cba85/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:28 crc kubenswrapper[4749]: I1001 14:42:28.886761 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d/tempest-tests-tempest-tests-runner/0.log" Oct 01 14:42:29 crc kubenswrapper[4749]: I1001 14:42:29.057850 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1390b811-1714-4e90-9491-e38ba4b0a530/test-operator-logs-container/0.log" Oct 01 14:42:29 crc kubenswrapper[4749]: I1001 14:42:29.247134 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-w88x7_36d63e13-8131-47f4-a65a-a78db593d3bf/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:42:30 crc kubenswrapper[4749]: I1001 14:42:30.331710 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_60860f07-ba03-4dfb-bb91-2bd68232bc90/watcher-applier/0.log" Oct 01 14:42:30 crc kubenswrapper[4749]: I1001 14:42:30.473620 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_696adfa9-0326-4d60-8a2d-c53ee267a249/watcher-api-log/0.log" Oct 01 14:42:30 crc kubenswrapper[4749]: I1001 14:42:30.770119 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_3f0334e5-add1-4ced-bad4-7e77d528e28a/watcher-decision-engine/0.log" Oct 01 14:42:34 crc kubenswrapper[4749]: I1001 14:42:34.080195 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_3f0334e5-add1-4ced-bad4-7e77d528e28a/watcher-decision-engine/1.log" Oct 01 14:42:34 crc kubenswrapper[4749]: I1001 14:42:34.542423 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_watcher-api-0_696adfa9-0326-4d60-8a2d-c53ee267a249/watcher-api/0.log" Oct 01 14:42:37 crc kubenswrapper[4749]: I1001 14:42:37.236018 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:42:37 crc kubenswrapper[4749]: E1001 14:42:37.236852 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:42:46 crc kubenswrapper[4749]: I1001 14:42:46.693574 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09/memcached/0.log" Oct 01 14:42:50 crc kubenswrapper[4749]: I1001 14:42:50.230691 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:42:50 crc kubenswrapper[4749]: E1001 14:42:50.231664 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:43:04 crc kubenswrapper[4749]: I1001 14:43:04.230329 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:43:04 crc kubenswrapper[4749]: E1001 14:43:04.231058 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:43:08 crc kubenswrapper[4749]: I1001 14:43:08.068176 4749 generic.go:334] "Generic (PLEG): container finished" podID="0c080998-4601-4515-9938-c83aa425429d" containerID="cb3b464281fa1657283409f5cf05e8cfdc8e2623fcbd79af2a764289b7db20b3" exitCode=0 Oct 01 14:43:08 crc kubenswrapper[4749]: I1001 14:43:08.068388 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" event={"ID":"0c080998-4601-4515-9938-c83aa425429d","Type":"ContainerDied","Data":"cb3b464281fa1657283409f5cf05e8cfdc8e2623fcbd79af2a764289b7db20b3"} Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.209162 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.262780 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-z2nf9"] Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.269907 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-z2nf9"] Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.299306 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c080998-4601-4515-9938-c83aa425429d-host\") pod \"0c080998-4601-4515-9938-c83aa425429d\" (UID: \"0c080998-4601-4515-9938-c83aa425429d\") " Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.299403 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqcwx\" (UniqueName: \"kubernetes.io/projected/0c080998-4601-4515-9938-c83aa425429d-kube-api-access-lqcwx\") pod \"0c080998-4601-4515-9938-c83aa425429d\" (UID: \"0c080998-4601-4515-9938-c83aa425429d\") " Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.299435 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c080998-4601-4515-9938-c83aa425429d-host" (OuterVolumeSpecName: "host") pod "0c080998-4601-4515-9938-c83aa425429d" (UID: "0c080998-4601-4515-9938-c83aa425429d"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.300114 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c080998-4601-4515-9938-c83aa425429d-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.304340 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c080998-4601-4515-9938-c83aa425429d-kube-api-access-lqcwx" (OuterVolumeSpecName: "kube-api-access-lqcwx") pod "0c080998-4601-4515-9938-c83aa425429d" (UID: "0c080998-4601-4515-9938-c83aa425429d"). InnerVolumeSpecName "kube-api-access-lqcwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:43:09 crc kubenswrapper[4749]: I1001 14:43:09.402075 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqcwx\" (UniqueName: \"kubernetes.io/projected/0c080998-4601-4515-9938-c83aa425429d-kube-api-access-lqcwx\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.089233 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="836b4a88460d9194d45c0b6357d212f7f75e2cbbec5ec25c0068241d6b53264b" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.089333 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-z2nf9" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.438834 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-45bfg"] Oct 01 14:43:10 crc kubenswrapper[4749]: E1001 14:43:10.439430 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerName="extract-content" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.439451 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerName="extract-content" Oct 01 14:43:10 crc kubenswrapper[4749]: E1001 14:43:10.439469 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerName="registry-server" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.439481 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerName="registry-server" Oct 01 14:43:10 crc kubenswrapper[4749]: E1001 14:43:10.439503 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerName="extract-utilities" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.439516 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" containerName="extract-utilities" Oct 01 14:43:10 crc kubenswrapper[4749]: E1001 14:43:10.439566 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c080998-4601-4515-9938-c83aa425429d" containerName="container-00" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.439578 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c080998-4601-4515-9938-c83aa425429d" containerName="container-00" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.439972 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba20c94-5658-4c67-b52f-88722cd86e4c" 
containerName="registry-server" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.440002 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c080998-4601-4515-9938-c83aa425429d" containerName="container-00" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.440989 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.521727 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzwx\" (UniqueName: \"kubernetes.io/projected/0efb413c-2a7f-4459-9883-5b160eb8392c-kube-api-access-tgzwx\") pod \"crc-debug-45bfg\" (UID: \"0efb413c-2a7f-4459-9883-5b160eb8392c\") " pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.521821 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0efb413c-2a7f-4459-9883-5b160eb8392c-host\") pod \"crc-debug-45bfg\" (UID: \"0efb413c-2a7f-4459-9883-5b160eb8392c\") " pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.624159 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0efb413c-2a7f-4459-9883-5b160eb8392c-host\") pod \"crc-debug-45bfg\" (UID: \"0efb413c-2a7f-4459-9883-5b160eb8392c\") " pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.624304 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0efb413c-2a7f-4459-9883-5b160eb8392c-host\") pod \"crc-debug-45bfg\" (UID: \"0efb413c-2a7f-4459-9883-5b160eb8392c\") " pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.624326 4749 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzwx\" (UniqueName: \"kubernetes.io/projected/0efb413c-2a7f-4459-9883-5b160eb8392c-kube-api-access-tgzwx\") pod \"crc-debug-45bfg\" (UID: \"0efb413c-2a7f-4459-9883-5b160eb8392c\") " pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.654038 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzwx\" (UniqueName: \"kubernetes.io/projected/0efb413c-2a7f-4459-9883-5b160eb8392c-kube-api-access-tgzwx\") pod \"crc-debug-45bfg\" (UID: \"0efb413c-2a7f-4459-9883-5b160eb8392c\") " pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:10 crc kubenswrapper[4749]: I1001 14:43:10.762815 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:11 crc kubenswrapper[4749]: I1001 14:43:11.098572 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" event={"ID":"0efb413c-2a7f-4459-9883-5b160eb8392c","Type":"ContainerStarted","Data":"e6eac4604964ac43b8e09d373b25d7e9ec8ee77689274a9400de0dce872874f9"} Oct 01 14:43:11 crc kubenswrapper[4749]: I1001 14:43:11.098884 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" event={"ID":"0efb413c-2a7f-4459-9883-5b160eb8392c","Type":"ContainerStarted","Data":"2f225e43383dd6509413b93ded467d2edda84dba0fa462dc673829d48eb8729f"} Oct 01 14:43:11 crc kubenswrapper[4749]: I1001 14:43:11.118316 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" podStartSLOduration=1.118293585 podStartE2EDuration="1.118293585s" podCreationTimestamp="2025-10-01 14:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
14:43:11.112080166 +0000 UTC m=+5851.166065065" watchObservedRunningTime="2025-10-01 14:43:11.118293585 +0000 UTC m=+5851.172278514" Oct 01 14:43:11 crc kubenswrapper[4749]: I1001 14:43:11.243771 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c080998-4601-4515-9938-c83aa425429d" path="/var/lib/kubelet/pods/0c080998-4601-4515-9938-c83aa425429d/volumes" Oct 01 14:43:12 crc kubenswrapper[4749]: I1001 14:43:12.107785 4749 generic.go:334] "Generic (PLEG): container finished" podID="0efb413c-2a7f-4459-9883-5b160eb8392c" containerID="e6eac4604964ac43b8e09d373b25d7e9ec8ee77689274a9400de0dce872874f9" exitCode=0 Oct 01 14:43:12 crc kubenswrapper[4749]: I1001 14:43:12.107857 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" event={"ID":"0efb413c-2a7f-4459-9883-5b160eb8392c","Type":"ContainerDied","Data":"e6eac4604964ac43b8e09d373b25d7e9ec8ee77689274a9400de0dce872874f9"} Oct 01 14:43:13 crc kubenswrapper[4749]: I1001 14:43:13.232668 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:13 crc kubenswrapper[4749]: I1001 14:43:13.396069 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgzwx\" (UniqueName: \"kubernetes.io/projected/0efb413c-2a7f-4459-9883-5b160eb8392c-kube-api-access-tgzwx\") pod \"0efb413c-2a7f-4459-9883-5b160eb8392c\" (UID: \"0efb413c-2a7f-4459-9883-5b160eb8392c\") " Oct 01 14:43:13 crc kubenswrapper[4749]: I1001 14:43:13.396148 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0efb413c-2a7f-4459-9883-5b160eb8392c-host\") pod \"0efb413c-2a7f-4459-9883-5b160eb8392c\" (UID: \"0efb413c-2a7f-4459-9883-5b160eb8392c\") " Oct 01 14:43:13 crc kubenswrapper[4749]: I1001 14:43:13.396241 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0efb413c-2a7f-4459-9883-5b160eb8392c-host" (OuterVolumeSpecName: "host") pod "0efb413c-2a7f-4459-9883-5b160eb8392c" (UID: "0efb413c-2a7f-4459-9883-5b160eb8392c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:43:13 crc kubenswrapper[4749]: I1001 14:43:13.396721 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0efb413c-2a7f-4459-9883-5b160eb8392c-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:13 crc kubenswrapper[4749]: I1001 14:43:13.401978 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0efb413c-2a7f-4459-9883-5b160eb8392c-kube-api-access-tgzwx" (OuterVolumeSpecName: "kube-api-access-tgzwx") pod "0efb413c-2a7f-4459-9883-5b160eb8392c" (UID: "0efb413c-2a7f-4459-9883-5b160eb8392c"). InnerVolumeSpecName "kube-api-access-tgzwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:43:13 crc kubenswrapper[4749]: I1001 14:43:13.497922 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgzwx\" (UniqueName: \"kubernetes.io/projected/0efb413c-2a7f-4459-9883-5b160eb8392c-kube-api-access-tgzwx\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:14 crc kubenswrapper[4749]: I1001 14:43:14.125386 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" event={"ID":"0efb413c-2a7f-4459-9883-5b160eb8392c","Type":"ContainerDied","Data":"2f225e43383dd6509413b93ded467d2edda84dba0fa462dc673829d48eb8729f"} Oct 01 14:43:14 crc kubenswrapper[4749]: I1001 14:43:14.125422 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f225e43383dd6509413b93ded467d2edda84dba0fa462dc673829d48eb8729f" Oct 01 14:43:14 crc kubenswrapper[4749]: I1001 14:43:14.125452 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-45bfg" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.665915 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jvpws"] Oct 01 14:43:16 crc kubenswrapper[4749]: E1001 14:43:16.666915 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0efb413c-2a7f-4459-9883-5b160eb8392c" containerName="container-00" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.666928 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="0efb413c-2a7f-4459-9883-5b160eb8392c" containerName="container-00" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.667112 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="0efb413c-2a7f-4459-9883-5b160eb8392c" containerName="container-00" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.668552 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.684471 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvpws"] Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.743091 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cksw\" (UniqueName: \"kubernetes.io/projected/93d61a6b-d0be-4424-bff1-edcf848d8952-kube-api-access-8cksw\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.743174 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-utilities\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.743195 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-catalog-content\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.844993 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-utilities\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.845028 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-catalog-content\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.845190 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cksw\" (UniqueName: \"kubernetes.io/projected/93d61a6b-d0be-4424-bff1-edcf848d8952-kube-api-access-8cksw\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.846288 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-utilities\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.847405 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-catalog-content\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.873129 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cksw\" (UniqueName: \"kubernetes.io/projected/93d61a6b-d0be-4424-bff1-edcf848d8952-kube-api-access-8cksw\") pod \"redhat-operators-jvpws\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:16 crc kubenswrapper[4749]: I1001 14:43:16.993483 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:17 crc kubenswrapper[4749]: I1001 14:43:17.229668 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:43:17 crc kubenswrapper[4749]: E1001 14:43:17.229902 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:43:17 crc kubenswrapper[4749]: I1001 14:43:17.425887 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvpws"] Oct 01 14:43:18 crc kubenswrapper[4749]: I1001 14:43:18.170395 4749 generic.go:334] "Generic (PLEG): container finished" podID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerID="f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448" exitCode=0 Oct 01 14:43:18 crc kubenswrapper[4749]: I1001 14:43:18.170499 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvpws" event={"ID":"93d61a6b-d0be-4424-bff1-edcf848d8952","Type":"ContainerDied","Data":"f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448"} Oct 01 14:43:18 crc kubenswrapper[4749]: I1001 14:43:18.170708 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvpws" event={"ID":"93d61a6b-d0be-4424-bff1-edcf848d8952","Type":"ContainerStarted","Data":"e01319b1ca45b1a10c8510e134a6b5ad27a7ed96d5f6f8d87a243fe30c7acace"} Oct 01 14:43:19 crc kubenswrapper[4749]: I1001 14:43:19.533587 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-45bfg"] Oct 01 14:43:19 
crc kubenswrapper[4749]: I1001 14:43:19.541543 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-45bfg"] Oct 01 14:43:20 crc kubenswrapper[4749]: I1001 14:43:20.197630 4749 generic.go:334] "Generic (PLEG): container finished" podID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerID="c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2" exitCode=0 Oct 01 14:43:20 crc kubenswrapper[4749]: I1001 14:43:20.197678 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvpws" event={"ID":"93d61a6b-d0be-4424-bff1-edcf848d8952","Type":"ContainerDied","Data":"c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2"} Oct 01 14:43:20 crc kubenswrapper[4749]: I1001 14:43:20.756521 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-d4jlz"] Oct 01 14:43:20 crc kubenswrapper[4749]: I1001 14:43:20.758116 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:20 crc kubenswrapper[4749]: I1001 14:43:20.923638 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4c118c8-655c-4c98-9846-bc80a520cdb4-host\") pod \"crc-debug-d4jlz\" (UID: \"b4c118c8-655c-4c98-9846-bc80a520cdb4\") " pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:20 crc kubenswrapper[4749]: I1001 14:43:20.923936 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74pt\" (UniqueName: \"kubernetes.io/projected/b4c118c8-655c-4c98-9846-bc80a520cdb4-kube-api-access-r74pt\") pod \"crc-debug-d4jlz\" (UID: \"b4c118c8-655c-4c98-9846-bc80a520cdb4\") " pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:21 crc kubenswrapper[4749]: I1001 14:43:21.025949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4c118c8-655c-4c98-9846-bc80a520cdb4-host\") pod \"crc-debug-d4jlz\" (UID: \"b4c118c8-655c-4c98-9846-bc80a520cdb4\") " pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:21 crc kubenswrapper[4749]: I1001 14:43:21.026003 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r74pt\" (UniqueName: \"kubernetes.io/projected/b4c118c8-655c-4c98-9846-bc80a520cdb4-kube-api-access-r74pt\") pod \"crc-debug-d4jlz\" (UID: \"b4c118c8-655c-4c98-9846-bc80a520cdb4\") " pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:21 crc kubenswrapper[4749]: I1001 14:43:21.026095 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4c118c8-655c-4c98-9846-bc80a520cdb4-host\") pod \"crc-debug-d4jlz\" (UID: \"b4c118c8-655c-4c98-9846-bc80a520cdb4\") " pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:21 crc 
kubenswrapper[4749]: I1001 14:43:21.058239 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74pt\" (UniqueName: \"kubernetes.io/projected/b4c118c8-655c-4c98-9846-bc80a520cdb4-kube-api-access-r74pt\") pod \"crc-debug-d4jlz\" (UID: \"b4c118c8-655c-4c98-9846-bc80a520cdb4\") " pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:21 crc kubenswrapper[4749]: I1001 14:43:21.108962 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:21 crc kubenswrapper[4749]: W1001 14:43:21.140266 4749 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4c118c8_655c_4c98_9846_bc80a520cdb4.slice/crio-b54181986dd4fee9be19a199b6c9630c19d6fd3a1658238cfd0ee9f6dc97da42 WatchSource:0}: Error finding container b54181986dd4fee9be19a199b6c9630c19d6fd3a1658238cfd0ee9f6dc97da42: Status 404 returned error can't find the container with id b54181986dd4fee9be19a199b6c9630c19d6fd3a1658238cfd0ee9f6dc97da42 Oct 01 14:43:21 crc kubenswrapper[4749]: I1001 14:43:21.211704 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" event={"ID":"b4c118c8-655c-4c98-9846-bc80a520cdb4","Type":"ContainerStarted","Data":"b54181986dd4fee9be19a199b6c9630c19d6fd3a1658238cfd0ee9f6dc97da42"} Oct 01 14:43:21 crc kubenswrapper[4749]: I1001 14:43:21.215070 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvpws" event={"ID":"93d61a6b-d0be-4424-bff1-edcf848d8952","Type":"ContainerStarted","Data":"a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b"} Oct 01 14:43:21 crc kubenswrapper[4749]: I1001 14:43:21.246239 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jvpws" podStartSLOduration=2.7493037080000002 podStartE2EDuration="5.246205663s" 
podCreationTimestamp="2025-10-01 14:43:16 +0000 UTC" firstStartedPulling="2025-10-01 14:43:18.172924743 +0000 UTC m=+5858.226909642" lastFinishedPulling="2025-10-01 14:43:20.669826698 +0000 UTC m=+5860.723811597" observedRunningTime="2025-10-01 14:43:21.238389658 +0000 UTC m=+5861.292374567" watchObservedRunningTime="2025-10-01 14:43:21.246205663 +0000 UTC m=+5861.300190562" Oct 01 14:43:21 crc kubenswrapper[4749]: I1001 14:43:21.257437 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0efb413c-2a7f-4459-9883-5b160eb8392c" path="/var/lib/kubelet/pods/0efb413c-2a7f-4459-9883-5b160eb8392c/volumes" Oct 01 14:43:22 crc kubenswrapper[4749]: I1001 14:43:22.235577 4749 generic.go:334] "Generic (PLEG): container finished" podID="b4c118c8-655c-4c98-9846-bc80a520cdb4" containerID="e5aa8690f875897b45e87d447994033d05436eda29c1405bef5d48f632cd247b" exitCode=0 Oct 01 14:43:22 crc kubenswrapper[4749]: I1001 14:43:22.235703 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" event={"ID":"b4c118c8-655c-4c98-9846-bc80a520cdb4","Type":"ContainerDied","Data":"e5aa8690f875897b45e87d447994033d05436eda29c1405bef5d48f632cd247b"} Oct 01 14:43:22 crc kubenswrapper[4749]: I1001 14:43:22.281644 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-d4jlz"] Oct 01 14:43:22 crc kubenswrapper[4749]: I1001 14:43:22.292389 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lqrjw/crc-debug-d4jlz"] Oct 01 14:43:23 crc kubenswrapper[4749]: I1001 14:43:23.338205 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:23 crc kubenswrapper[4749]: I1001 14:43:23.477304 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r74pt\" (UniqueName: \"kubernetes.io/projected/b4c118c8-655c-4c98-9846-bc80a520cdb4-kube-api-access-r74pt\") pod \"b4c118c8-655c-4c98-9846-bc80a520cdb4\" (UID: \"b4c118c8-655c-4c98-9846-bc80a520cdb4\") " Oct 01 14:43:23 crc kubenswrapper[4749]: I1001 14:43:23.477413 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4c118c8-655c-4c98-9846-bc80a520cdb4-host\") pod \"b4c118c8-655c-4c98-9846-bc80a520cdb4\" (UID: \"b4c118c8-655c-4c98-9846-bc80a520cdb4\") " Oct 01 14:43:23 crc kubenswrapper[4749]: I1001 14:43:23.477512 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4c118c8-655c-4c98-9846-bc80a520cdb4-host" (OuterVolumeSpecName: "host") pod "b4c118c8-655c-4c98-9846-bc80a520cdb4" (UID: "b4c118c8-655c-4c98-9846-bc80a520cdb4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:43:23 crc kubenswrapper[4749]: I1001 14:43:23.477920 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4c118c8-655c-4c98-9846-bc80a520cdb4-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:23 crc kubenswrapper[4749]: I1001 14:43:23.482914 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4c118c8-655c-4c98-9846-bc80a520cdb4-kube-api-access-r74pt" (OuterVolumeSpecName: "kube-api-access-r74pt") pod "b4c118c8-655c-4c98-9846-bc80a520cdb4" (UID: "b4c118c8-655c-4c98-9846-bc80a520cdb4"). InnerVolumeSpecName "kube-api-access-r74pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:43:23 crc kubenswrapper[4749]: I1001 14:43:23.580319 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r74pt\" (UniqueName: \"kubernetes.io/projected/b4c118c8-655c-4c98-9846-bc80a520cdb4-kube-api-access-r74pt\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.104910 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-vvxcs_99a6dbcc-0b05-4471-b2d4-acacf72f6ff0/kube-rbac-proxy/0.log" Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.256555 4749 scope.go:117] "RemoveContainer" containerID="e5aa8690f875897b45e87d447994033d05436eda29c1405bef5d48f632cd247b" Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.256608 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/crc-debug-d4jlz" Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.408656 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-vvxcs_99a6dbcc-0b05-4471-b2d4-acacf72f6ff0/manager/0.log" Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.582306 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-58njn_4a7c1ef4-c125-445b-9f1e-b24ee27e2938/kube-rbac-proxy/0.log" Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.702099 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-58njn_4a7c1ef4-c125-445b-9f1e-b24ee27e2938/manager/0.log" Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.750189 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-ck4dv_0418f548-554a-4efc-8494-4edc9d56fc7f/kube-rbac-proxy/0.log" 
Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.767938 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-ck4dv_0418f548-554a-4efc-8494-4edc9d56fc7f/manager/0.log" Oct 01 14:43:24 crc kubenswrapper[4749]: I1001 14:43:24.882231 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/util/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.072049 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/pull/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.111069 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/util/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.151750 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/pull/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.241550 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4c118c8-655c-4c98-9846-bc80a520cdb4" path="/var/lib/kubelet/pods/b4c118c8-655c-4c98-9846-bc80a520cdb4/volumes" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.279548 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/util/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.336652 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/extract/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.353184 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/pull/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.465562 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-6w5m5_6647f74c-8bbb-490d-ade7-8b2fb5469ddc/kube-rbac-proxy/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.564620 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-6w5m5_6647f74c-8bbb-490d-ade7-8b2fb5469ddc/manager/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.639716 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-h22cx_11409643-1cee-49c5-b3d0-fa1ec4cb1af0/kube-rbac-proxy/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.661084 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-h22cx_11409643-1cee-49c5-b3d0-fa1ec4cb1af0/manager/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.770171 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-4gdqf_02d25e12-ca12-40f3-bc21-2b5a55fdba5d/kube-rbac-proxy/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 14:43:25.910191 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-4gdqf_02d25e12-ca12-40f3-bc21-2b5a55fdba5d/manager/0.log" Oct 01 14:43:25 crc kubenswrapper[4749]: I1001 
14:43:25.935326 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-xk2zl_8f20ab81-68d3-4973-9336-d00440b811f9/kube-rbac-proxy/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.121687 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-bhn8n_40a29698-620f-45d7-b630-0cfe188dd09f/kube-rbac-proxy/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.148162 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-xk2zl_8f20ab81-68d3-4973-9336-d00440b811f9/manager/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.183874 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-bhn8n_40a29698-620f-45d7-b630-0cfe188dd09f/manager/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.298060 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-292hc_00534b7e-41c7-4935-8349-78aee327867e/kube-rbac-proxy/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.395825 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-292hc_00534b7e-41c7-4935-8349-78aee327867e/manager/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.481551 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-8z228_6a63cd00-64f1-42cf-8250-abc3dfc3a4ff/kube-rbac-proxy/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.514955 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-8z228_6a63cd00-64f1-42cf-8250-abc3dfc3a4ff/manager/0.log" Oct 01 
14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.590759 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-8898x_3bb59228-30b8-42af-b24c-dc50224fde04/kube-rbac-proxy/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.686009 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-8898x_3bb59228-30b8-42af-b24c-dc50224fde04/manager/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.706961 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-j4s7v_a26beb61-7189-40d0-9284-e58654887bbd/kube-rbac-proxy/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.817337 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-j4s7v_a26beb61-7189-40d0-9284-e58654887bbd/manager/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.940761 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-jqvd5_bf6d3a96-3f74-44df-8e75-1865612d0303/kube-rbac-proxy/0.log" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.994000 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:26 crc kubenswrapper[4749]: I1001 14:43:26.994433 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.028826 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-jqvd5_bf6d3a96-3f74-44df-8e75-1865612d0303/manager/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.052606 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.124264 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-cr9ww_ab9def29-b23e-4af8-828b-3c4151503a96/kube-rbac-proxy/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.131367 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-cr9ww_ab9def29-b23e-4af8-828b-3c4151503a96/manager/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.247441 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cg66wx_557ca73b-a87b-4d42-8d86-dfbd057ae1fd/kube-rbac-proxy/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.296153 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cg66wx_557ca73b-a87b-4d42-8d86-dfbd057ae1fd/manager/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.336198 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.389319 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvpws"] Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.480049 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5fc59ccd99-c2sc8_9aac26fb-f511-4491-a239-7c5f7ced5f43/kube-rbac-proxy/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.529010 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7445ccf7db-stmwq_b0603207-94a4-47e5-aff1-2572d337f429/kube-rbac-proxy/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.815592 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dmmcd_855affad-2b74-41a1-89c8-d6eba2072bb7/registry-server/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.870959 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7445ccf7db-stmwq_b0603207-94a4-47e5-aff1-2572d337f429/operator/0.log" Oct 01 14:43:27 crc kubenswrapper[4749]: I1001 14:43:27.966601 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-cwjsk_f7c97980-24c5-42e5-b60c-763bd31ad269/kube-rbac-proxy/0.log" Oct 01 14:43:28 crc kubenswrapper[4749]: I1001 14:43:28.078014 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-tzpfm_b544aac2-b3f7-453e-a05b-58f22b1b4fe1/kube-rbac-proxy/0.log" Oct 01 14:43:28 crc kubenswrapper[4749]: I1001 14:43:28.134322 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-cwjsk_f7c97980-24c5-42e5-b60c-763bd31ad269/manager/0.log" Oct 01 14:43:28 crc kubenswrapper[4749]: I1001 14:43:28.277764 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-tzpfm_b544aac2-b3f7-453e-a05b-58f22b1b4fe1/manager/0.log" Oct 01 14:43:28 crc kubenswrapper[4749]: I1001 14:43:28.406144 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-x26nl_33e04d57-8c80-4ed0-b05b-1edf290d476e/operator/0.log" Oct 01 14:43:28 crc kubenswrapper[4749]: I1001 14:43:28.596697 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-9s6hz_765c8918-abbc-47dd-8960-18292d54a9a0/kube-rbac-proxy/0.log" Oct 01 14:43:28 crc kubenswrapper[4749]: I1001 14:43:28.614999 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-5qtjp_c81045dc-62f0-4ae6-9e05-e26cc8f90611/manager/0.log" Oct 01 14:43:28 crc kubenswrapper[4749]: I1001 14:43:28.627696 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-5qtjp_c81045dc-62f0-4ae6-9e05-e26cc8f90611/kube-rbac-proxy/0.log" Oct 01 14:43:28 crc kubenswrapper[4749]: I1001 14:43:28.863622 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5fc59ccd99-c2sc8_9aac26fb-f511-4491-a239-7c5f7ced5f43/manager/0.log" Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.047888 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-bdz5p_acf0c68b-d009-4f2f-a21e-a2573842a063/kube-rbac-proxy/0.log" Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.143272 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-9s6hz_765c8918-abbc-47dd-8960-18292d54a9a0/manager/0.log" Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.156087 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-bdz5p_acf0c68b-d009-4f2f-a21e-a2573842a063/manager/0.log" Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.253505 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-56f5865b8b-4g6n9_1d38e134-a674-403f-9a4b-4ee8de1fe763/kube-rbac-proxy/0.log" Oct 01 14:43:29 crc 
kubenswrapper[4749]: I1001 14:43:29.300507 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jvpws" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerName="registry-server" containerID="cri-o://a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b" gracePeriod=2 Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.307754 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-56f5865b8b-4g6n9_1d38e134-a674-403f-9a4b-4ee8de1fe763/manager/0.log" Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.780471 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.899291 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-catalog-content\") pod \"93d61a6b-d0be-4424-bff1-edcf848d8952\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.899469 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-utilities\") pod \"93d61a6b-d0be-4424-bff1-edcf848d8952\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.899509 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cksw\" (UniqueName: \"kubernetes.io/projected/93d61a6b-d0be-4424-bff1-edcf848d8952-kube-api-access-8cksw\") pod \"93d61a6b-d0be-4424-bff1-edcf848d8952\" (UID: \"93d61a6b-d0be-4424-bff1-edcf848d8952\") " Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.900065 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-utilities" (OuterVolumeSpecName: "utilities") pod "93d61a6b-d0be-4424-bff1-edcf848d8952" (UID: "93d61a6b-d0be-4424-bff1-edcf848d8952"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.904867 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d61a6b-d0be-4424-bff1-edcf848d8952-kube-api-access-8cksw" (OuterVolumeSpecName: "kube-api-access-8cksw") pod "93d61a6b-d0be-4424-bff1-edcf848d8952" (UID: "93d61a6b-d0be-4424-bff1-edcf848d8952"). InnerVolumeSpecName "kube-api-access-8cksw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:43:29 crc kubenswrapper[4749]: I1001 14:43:29.981069 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93d61a6b-d0be-4424-bff1-edcf848d8952" (UID: "93d61a6b-d0be-4424-bff1-edcf848d8952"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.001636 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.001672 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cksw\" (UniqueName: \"kubernetes.io/projected/93d61a6b-d0be-4424-bff1-edcf848d8952-kube-api-access-8cksw\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.001684 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d61a6b-d0be-4424-bff1-edcf848d8952-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.229691 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:43:30 crc kubenswrapper[4749]: E1001 14:43:30.230072 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.311693 4749 generic.go:334] "Generic (PLEG): container finished" podID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerID="a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b" exitCode=0 Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.311755 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvpws" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.311784 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvpws" event={"ID":"93d61a6b-d0be-4424-bff1-edcf848d8952","Type":"ContainerDied","Data":"a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b"} Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.312053 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvpws" event={"ID":"93d61a6b-d0be-4424-bff1-edcf848d8952","Type":"ContainerDied","Data":"e01319b1ca45b1a10c8510e134a6b5ad27a7ed96d5f6f8d87a243fe30c7acace"} Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.312070 4749 scope.go:117] "RemoveContainer" containerID="a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.332181 4749 scope.go:117] "RemoveContainer" containerID="c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.375964 4749 scope.go:117] "RemoveContainer" containerID="f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.379696 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvpws"] Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.391914 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jvpws"] Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.407779 4749 scope.go:117] "RemoveContainer" containerID="a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b" Oct 01 14:43:30 crc kubenswrapper[4749]: E1001 14:43:30.408232 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b\": container with ID starting with a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b not found: ID does not exist" containerID="a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.408262 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b"} err="failed to get container status \"a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b\": rpc error: code = NotFound desc = could not find container \"a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b\": container with ID starting with a64d529d6b05fbccf326d1a97626185c5b444f96083fabd2ce57179655aabb1b not found: ID does not exist" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.408281 4749 scope.go:117] "RemoveContainer" containerID="c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2" Oct 01 14:43:30 crc kubenswrapper[4749]: E1001 14:43:30.408568 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2\": container with ID starting with c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2 not found: ID does not exist" containerID="c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.408597 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2"} err="failed to get container status \"c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2\": rpc error: code = NotFound desc = could not find container \"c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2\": container with ID 
starting with c63840fc79fe328aaabb9e08bfa9b4b89da1c0bf2c12eb6370557300fbf802d2 not found: ID does not exist" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.408609 4749 scope.go:117] "RemoveContainer" containerID="f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448" Oct 01 14:43:30 crc kubenswrapper[4749]: E1001 14:43:30.408878 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448\": container with ID starting with f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448 not found: ID does not exist" containerID="f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448" Oct 01 14:43:30 crc kubenswrapper[4749]: I1001 14:43:30.408939 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448"} err="failed to get container status \"f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448\": rpc error: code = NotFound desc = could not find container \"f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448\": container with ID starting with f35f2cfedcc101538bb3e20315f63a3b83776b37f38834dd083ed1abcc623448 not found: ID does not exist" Oct 01 14:43:31 crc kubenswrapper[4749]: I1001 14:43:31.240368 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" path="/var/lib/kubelet/pods/93d61a6b-d0be-4424-bff1-edcf848d8952/volumes" Oct 01 14:43:44 crc kubenswrapper[4749]: I1001 14:43:44.230909 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:43:44 crc kubenswrapper[4749]: E1001 14:43:44.231568 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:43:45 crc kubenswrapper[4749]: I1001 14:43:45.292873 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8twps_31175322-99a0-4224-82d9-ca63e5a241c8/control-plane-machine-set-operator/0.log" Oct 01 14:43:45 crc kubenswrapper[4749]: I1001 14:43:45.395539 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nwwtl_78eab148-c7db-4caf-99dd-7576fdee2366/kube-rbac-proxy/0.log" Oct 01 14:43:45 crc kubenswrapper[4749]: I1001 14:43:45.446114 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nwwtl_78eab148-c7db-4caf-99dd-7576fdee2366/machine-api-operator/0.log" Oct 01 14:43:57 crc kubenswrapper[4749]: I1001 14:43:57.229670 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:43:57 crc kubenswrapper[4749]: E1001 14:43:57.230403 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:43:57 crc kubenswrapper[4749]: I1001 14:43:57.249370 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4mvl7_851207b0-7920-45e6-b27b-aeda659789b7/cert-manager-controller/0.log" Oct 01 14:43:57 crc 
kubenswrapper[4749]: I1001 14:43:57.393083 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-8k42j_58b791af-6670-4e2f-8cd7-a55793e8d9ba/cert-manager-cainjector/0.log" Oct 01 14:43:57 crc kubenswrapper[4749]: I1001 14:43:57.427713 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-dbrwz_ab88e6b2-d588-4c1d-8946-89b5fe7c47f1/cert-manager-webhook/0.log" Oct 01 14:44:09 crc kubenswrapper[4749]: I1001 14:44:09.146981 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-64bz9_cf5a15f7-d043-4b90-828f-584b833d38e5/nmstate-console-plugin/0.log" Oct 01 14:44:09 crc kubenswrapper[4749]: I1001 14:44:09.264128 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vd27x_19901e16-c93e-4806-a467-7af1e9ad9405/nmstate-handler/0.log" Oct 01 14:44:09 crc kubenswrapper[4749]: I1001 14:44:09.323634 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-5nhb7_48ae603d-54fb-4c62-8c02-1e9d6034ca81/kube-rbac-proxy/0.log" Oct 01 14:44:09 crc kubenswrapper[4749]: I1001 14:44:09.347903 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-5nhb7_48ae603d-54fb-4c62-8c02-1e9d6034ca81/nmstate-metrics/0.log" Oct 01 14:44:09 crc kubenswrapper[4749]: I1001 14:44:09.473112 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-c7stb_a4382132-aa77-4918-8533-ea2d0cf18eba/nmstate-operator/0.log" Oct 01 14:44:09 crc kubenswrapper[4749]: I1001 14:44:09.570770 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-tk6l4_2237dcc7-ae68-4298-87d0-44d81d96b3c5/nmstate-webhook/0.log" Oct 01 14:44:11 crc kubenswrapper[4749]: I1001 14:44:11.237613 4749 scope.go:117] 
"RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:44:11 crc kubenswrapper[4749]: E1001 14:44:11.238088 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:44:22 crc kubenswrapper[4749]: I1001 14:44:22.948832 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-zrnd6_286d5dec-6b31-4235-ae91-705174a2aa4e/kube-rbac-proxy/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.162103 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-zrnd6_286d5dec-6b31-4235-ae91-705174a2aa4e/controller/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.172939 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-frr-files/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.298028 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-frr-files/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.319902 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-reloader/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.354298 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-reloader/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.367422 4749 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-metrics/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.526941 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-reloader/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.533583 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-metrics/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.537203 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-frr-files/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.573954 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-metrics/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.725647 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-frr-files/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.761681 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/controller/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.773596 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-reloader/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.786950 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-metrics/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.924050 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/frr-metrics/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.966851 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/kube-rbac-proxy/0.log" Oct 01 14:44:23 crc kubenswrapper[4749]: I1001 14:44:23.981825 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/kube-rbac-proxy-frr/0.log" Oct 01 14:44:24 crc kubenswrapper[4749]: I1001 14:44:24.117250 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/reloader/0.log" Oct 01 14:44:24 crc kubenswrapper[4749]: I1001 14:44:24.229387 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:44:24 crc kubenswrapper[4749]: I1001 14:44:24.229704 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-g9dzs_a398c955-5f6f-4519-8ad2-77d151718daf/frr-k8s-webhook-server/0.log" Oct 01 14:44:24 crc kubenswrapper[4749]: E1001 14:44:24.229774 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:44:24 crc kubenswrapper[4749]: I1001 14:44:24.410472 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-87f8f4bcc-p49h8_3cf2530f-bd63-401b-992b-51f01a86598c/manager/0.log" Oct 01 14:44:24 crc kubenswrapper[4749]: I1001 14:44:24.602165 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b54b59d49-p8dqv_c2aa45a1-115c-47cc-9b5f-d1a79549a3a8/webhook-server/0.log" Oct 01 14:44:24 crc kubenswrapper[4749]: I1001 14:44:24.664157 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nx44s_cf11208c-e4d3-4873-872a-9b6b168ff648/kube-rbac-proxy/0.log" Oct 01 14:44:25 crc kubenswrapper[4749]: I1001 14:44:25.361038 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nx44s_cf11208c-e4d3-4873-872a-9b6b168ff648/speaker/0.log" Oct 01 14:44:25 crc kubenswrapper[4749]: I1001 14:44:25.648793 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/frr/0.log" Oct 01 14:44:37 crc kubenswrapper[4749]: I1001 14:44:37.391103 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/util/0.log" Oct 01 14:44:37 crc kubenswrapper[4749]: I1001 14:44:37.626047 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/pull/0.log" Oct 01 14:44:37 crc kubenswrapper[4749]: I1001 14:44:37.638576 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/util/0.log" Oct 01 14:44:37 crc kubenswrapper[4749]: I1001 14:44:37.675924 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/pull/0.log" Oct 01 14:44:37 crc kubenswrapper[4749]: I1001 14:44:37.862814 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/pull/0.log" Oct 01 14:44:37 crc kubenswrapper[4749]: I1001 14:44:37.869544 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/util/0.log" Oct 01 14:44:37 crc kubenswrapper[4749]: I1001 14:44:37.899316 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/extract/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.026813 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/util/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.245322 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/util/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.291443 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/pull/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.296532 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/pull/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.422188 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/util/0.log" Oct 01 
14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.426423 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/pull/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.454543 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/extract/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.588438 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-utilities/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.764247 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-content/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.777601 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-utilities/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.803639 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-content/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.932210 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-utilities/0.log" Oct 01 14:44:38 crc kubenswrapper[4749]: I1001 14:44:38.961743 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-content/0.log" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.112138 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-utilities/0.log" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.229567 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:44:39 crc kubenswrapper[4749]: E1001 14:44:39.229824 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.396647 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-content/0.log" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.431282 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-utilities/0.log" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.432099 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-content/0.log" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.639912 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-utilities/0.log" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.679405 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-content/0.log" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.739461 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/registry-server/0.log" Oct 01 14:44:39 crc kubenswrapper[4749]: I1001 14:44:39.947579 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/util/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.143734 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/pull/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.171389 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/pull/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.174555 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/util/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.396919 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/pull/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.420563 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/util/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.474137 
4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/extract/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.486668 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/registry-server/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.590238 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rc7nj_87ce5873-e490-470b-8324-be053c551acb/marketplace-operator/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.685622 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-utilities/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.875417 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-content/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.886471 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-content/0.log" Oct 01 14:44:40 crc kubenswrapper[4749]: I1001 14:44:40.914641 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-utilities/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.069443 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-utilities/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.073854 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-content/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.090991 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-utilities/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.351627 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-utilities/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.353097 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-content/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.362084 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/registry-server/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.409105 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-content/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.536472 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-utilities/0.log" Oct 01 14:44:41 crc kubenswrapper[4749]: I1001 14:44:41.578615 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-content/0.log" Oct 01 14:44:42 crc kubenswrapper[4749]: I1001 14:44:42.294008 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/registry-server/0.log" Oct 01 
14:44:53 crc kubenswrapper[4749]: I1001 14:44:53.561060 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-jmcm2_74ad4854-4091-4485-b4da-881846999f3b/prometheus-operator/0.log" Oct 01 14:44:53 crc kubenswrapper[4749]: I1001 14:44:53.699912 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4_2393602c-447f-4166-8ce8-7cb58c8d5510/prometheus-operator-admission-webhook/0.log" Oct 01 14:44:53 crc kubenswrapper[4749]: I1001 14:44:53.729147 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn_f69075a9-9209-4d56-8111-2bdcd4dc52e6/prometheus-operator-admission-webhook/0.log" Oct 01 14:44:53 crc kubenswrapper[4749]: I1001 14:44:53.894609 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-v579p_698fb753-09d8-462d-a57d-95b1cb6bae9a/operator/0.log" Oct 01 14:44:53 crc kubenswrapper[4749]: I1001 14:44:53.929203 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-995fk_82f26847-56e4-48f0-b990-e2f4e8c9cfd6/perses-operator/0.log" Oct 01 14:44:54 crc kubenswrapper[4749]: I1001 14:44:54.230314 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:44:54 crc kubenswrapper[4749]: E1001 14:44:54.230608 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:45:00 crc 
kubenswrapper[4749]: I1001 14:45:00.173394 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh"] Oct 01 14:45:00 crc kubenswrapper[4749]: E1001 14:45:00.174270 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerName="registry-server" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.174284 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerName="registry-server" Oct 01 14:45:00 crc kubenswrapper[4749]: E1001 14:45:00.174296 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4c118c8-655c-4c98-9846-bc80a520cdb4" containerName="container-00" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.174301 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4c118c8-655c-4c98-9846-bc80a520cdb4" containerName="container-00" Oct 01 14:45:00 crc kubenswrapper[4749]: E1001 14:45:00.174365 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerName="extract-utilities" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.174373 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerName="extract-utilities" Oct 01 14:45:00 crc kubenswrapper[4749]: E1001 14:45:00.174387 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerName="extract-content" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.174392 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerName="extract-content" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.174584 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4c118c8-655c-4c98-9846-bc80a520cdb4" containerName="container-00" Oct 01 14:45:00 crc kubenswrapper[4749]: 
I1001 14:45:00.174597 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d61a6b-d0be-4424-bff1-edcf848d8952" containerName="registry-server" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.175252 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.184351 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.184422 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.191817 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh"] Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.275683 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-config-volume\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.275777 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-secret-volume\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.275951 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-cwtgg\" (UniqueName: \"kubernetes.io/projected/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-kube-api-access-cwtgg\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.377568 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-config-volume\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.377676 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-secret-volume\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.377769 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwtgg\" (UniqueName: \"kubernetes.io/projected/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-kube-api-access-cwtgg\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.378933 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-config-volume\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 
14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.387201 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-secret-volume\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.394777 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwtgg\" (UniqueName: \"kubernetes.io/projected/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-kube-api-access-cwtgg\") pod \"collect-profiles-29322165-x64jh\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:00 crc kubenswrapper[4749]: I1001 14:45:00.502093 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:01 crc kubenswrapper[4749]: I1001 14:45:01.085382 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh"] Oct 01 14:45:01 crc kubenswrapper[4749]: I1001 14:45:01.210011 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" event={"ID":"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff","Type":"ContainerStarted","Data":"cf3957099b65e4c701a9c3315c626bff2bc28ac7f17fa544fd5dfa93a798b6b9"} Oct 01 14:45:02 crc kubenswrapper[4749]: I1001 14:45:02.237174 4749 generic.go:334] "Generic (PLEG): container finished" podID="7cf06aa0-bd9b-4197-ad3c-07dbf48319ff" containerID="f83b3e369bee7315ef27bbf80606794116dd3d98bb1f2bfc0535fd94cd461c37" exitCode=0 Oct 01 14:45:02 crc kubenswrapper[4749]: I1001 14:45:02.237256 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" event={"ID":"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff","Type":"ContainerDied","Data":"f83b3e369bee7315ef27bbf80606794116dd3d98bb1f2bfc0535fd94cd461c37"} Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.674860 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.848026 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-config-volume\") pod \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.848146 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-secret-volume\") pod \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.848203 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwtgg\" (UniqueName: \"kubernetes.io/projected/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-kube-api-access-cwtgg\") pod \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\" (UID: \"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff\") " Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.850075 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "7cf06aa0-bd9b-4197-ad3c-07dbf48319ff" (UID: "7cf06aa0-bd9b-4197-ad3c-07dbf48319ff"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.857704 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7cf06aa0-bd9b-4197-ad3c-07dbf48319ff" (UID: "7cf06aa0-bd9b-4197-ad3c-07dbf48319ff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.864572 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-kube-api-access-cwtgg" (OuterVolumeSpecName: "kube-api-access-cwtgg") pod "7cf06aa0-bd9b-4197-ad3c-07dbf48319ff" (UID: "7cf06aa0-bd9b-4197-ad3c-07dbf48319ff"). InnerVolumeSpecName "kube-api-access-cwtgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.951346 4749 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.951682 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwtgg\" (UniqueName: \"kubernetes.io/projected/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-kube-api-access-cwtgg\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:03 crc kubenswrapper[4749]: I1001 14:45:03.951787 4749 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cf06aa0-bd9b-4197-ad3c-07dbf48319ff-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:04 crc kubenswrapper[4749]: I1001 14:45:04.279135 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" 
event={"ID":"7cf06aa0-bd9b-4197-ad3c-07dbf48319ff","Type":"ContainerDied","Data":"cf3957099b65e4c701a9c3315c626bff2bc28ac7f17fa544fd5dfa93a798b6b9"} Oct 01 14:45:04 crc kubenswrapper[4749]: I1001 14:45:04.279170 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf3957099b65e4c701a9c3315c626bff2bc28ac7f17fa544fd5dfa93a798b6b9" Oct 01 14:45:04 crc kubenswrapper[4749]: I1001 14:45:04.279444 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-x64jh" Oct 01 14:45:04 crc kubenswrapper[4749]: I1001 14:45:04.745574 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn"] Oct 01 14:45:04 crc kubenswrapper[4749]: I1001 14:45:04.754464 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322120-7gvzn"] Oct 01 14:45:05 crc kubenswrapper[4749]: I1001 14:45:05.230924 4749 scope.go:117] "RemoveContainer" containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:45:05 crc kubenswrapper[4749]: I1001 14:45:05.243704 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a20b509-c4f1-469c-a228-d779f1a05c9d" path="/var/lib/kubelet/pods/7a20b509-c4f1-469c-a228-d779f1a05c9d/volumes" Oct 01 14:45:06 crc kubenswrapper[4749]: I1001 14:45:06.299328 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"d4cf5263ca8bafbd9edc3554b2592b9589b7975d8a139c3dfc769d7079e44546"} Oct 01 14:45:11 crc kubenswrapper[4749]: I1001 14:45:11.555679 4749 scope.go:117] "RemoveContainer" containerID="a21f49b8e804f53cd8d3e978b2129d45d5e3ab958cf8fa5871ac45f9b24addca" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.338718 4749 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lfpfx"] Oct 01 14:45:20 crc kubenswrapper[4749]: E1001 14:45:20.340044 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf06aa0-bd9b-4197-ad3c-07dbf48319ff" containerName="collect-profiles" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.340062 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf06aa0-bd9b-4197-ad3c-07dbf48319ff" containerName="collect-profiles" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.340339 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf06aa0-bd9b-4197-ad3c-07dbf48319ff" containerName="collect-profiles" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.342197 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.365864 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfpfx"] Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.503583 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-utilities\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.503723 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6trzk\" (UniqueName: \"kubernetes.io/projected/72622a66-c6fb-4d32-af90-a7f10b0e1f94-kube-api-access-6trzk\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.503861 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-catalog-content\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.605445 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-catalog-content\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.605515 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-utilities\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.605573 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6trzk\" (UniqueName: \"kubernetes.io/projected/72622a66-c6fb-4d32-af90-a7f10b0e1f94-kube-api-access-6trzk\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.606004 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-catalog-content\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.606145 4749 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-utilities\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.629051 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6trzk\" (UniqueName: \"kubernetes.io/projected/72622a66-c6fb-4d32-af90-a7f10b0e1f94-kube-api-access-6trzk\") pod \"community-operators-lfpfx\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:20 crc kubenswrapper[4749]: I1001 14:45:20.665992 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:21 crc kubenswrapper[4749]: I1001 14:45:21.202158 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lfpfx"] Oct 01 14:45:21 crc kubenswrapper[4749]: I1001 14:45:21.465861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfpfx" event={"ID":"72622a66-c6fb-4d32-af90-a7f10b0e1f94","Type":"ContainerStarted","Data":"98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf"} Oct 01 14:45:21 crc kubenswrapper[4749]: I1001 14:45:21.465928 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfpfx" event={"ID":"72622a66-c6fb-4d32-af90-a7f10b0e1f94","Type":"ContainerStarted","Data":"e8de3a5dd1e1d3afd059f7c813227b9ba0c7860475a180b3a0ba9bd2c2f92676"} Oct 01 14:45:22 crc kubenswrapper[4749]: I1001 14:45:22.483241 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfpfx" 
event={"ID":"72622a66-c6fb-4d32-af90-a7f10b0e1f94","Type":"ContainerDied","Data":"98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf"} Oct 01 14:45:22 crc kubenswrapper[4749]: I1001 14:45:22.483168 4749 generic.go:334] "Generic (PLEG): container finished" podID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerID="98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf" exitCode=0 Oct 01 14:45:22 crc kubenswrapper[4749]: I1001 14:45:22.488778 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:45:23 crc kubenswrapper[4749]: I1001 14:45:23.495983 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfpfx" event={"ID":"72622a66-c6fb-4d32-af90-a7f10b0e1f94","Type":"ContainerStarted","Data":"7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d"} Oct 01 14:45:25 crc kubenswrapper[4749]: I1001 14:45:25.521535 4749 generic.go:334] "Generic (PLEG): container finished" podID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerID="7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d" exitCode=0 Oct 01 14:45:25 crc kubenswrapper[4749]: I1001 14:45:25.521651 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfpfx" event={"ID":"72622a66-c6fb-4d32-af90-a7f10b0e1f94","Type":"ContainerDied","Data":"7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d"} Oct 01 14:45:26 crc kubenswrapper[4749]: I1001 14:45:26.537010 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfpfx" event={"ID":"72622a66-c6fb-4d32-af90-a7f10b0e1f94","Type":"ContainerStarted","Data":"6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2"} Oct 01 14:45:30 crc kubenswrapper[4749]: I1001 14:45:30.667348 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:30 crc 
kubenswrapper[4749]: I1001 14:45:30.667908 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:30 crc kubenswrapper[4749]: I1001 14:45:30.743406 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:30 crc kubenswrapper[4749]: I1001 14:45:30.774106 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lfpfx" podStartSLOduration=7.164682055 podStartE2EDuration="10.774075958s" podCreationTimestamp="2025-10-01 14:45:20 +0000 UTC" firstStartedPulling="2025-10-01 14:45:22.488509763 +0000 UTC m=+5982.542494662" lastFinishedPulling="2025-10-01 14:45:26.097903666 +0000 UTC m=+5986.151888565" observedRunningTime="2025-10-01 14:45:26.572747771 +0000 UTC m=+5986.626732690" watchObservedRunningTime="2025-10-01 14:45:30.774075958 +0000 UTC m=+5990.828060887" Oct 01 14:45:31 crc kubenswrapper[4749]: I1001 14:45:31.655159 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:31 crc kubenswrapper[4749]: I1001 14:45:31.700861 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfpfx"] Oct 01 14:45:33 crc kubenswrapper[4749]: I1001 14:45:33.621784 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lfpfx" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerName="registry-server" containerID="cri-o://6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2" gracePeriod=2 Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.232424 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.420510 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6trzk\" (UniqueName: \"kubernetes.io/projected/72622a66-c6fb-4d32-af90-a7f10b0e1f94-kube-api-access-6trzk\") pod \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.420671 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-catalog-content\") pod \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.420746 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-utilities\") pod \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\" (UID: \"72622a66-c6fb-4d32-af90-a7f10b0e1f94\") " Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.424001 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-utilities" (OuterVolumeSpecName: "utilities") pod "72622a66-c6fb-4d32-af90-a7f10b0e1f94" (UID: "72622a66-c6fb-4d32-af90-a7f10b0e1f94"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.432514 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72622a66-c6fb-4d32-af90-a7f10b0e1f94-kube-api-access-6trzk" (OuterVolumeSpecName: "kube-api-access-6trzk") pod "72622a66-c6fb-4d32-af90-a7f10b0e1f94" (UID: "72622a66-c6fb-4d32-af90-a7f10b0e1f94"). InnerVolumeSpecName "kube-api-access-6trzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.524122 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6trzk\" (UniqueName: \"kubernetes.io/projected/72622a66-c6fb-4d32-af90-a7f10b0e1f94-kube-api-access-6trzk\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.524890 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.642595 4749 generic.go:334] "Generic (PLEG): container finished" podID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerID="6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2" exitCode=0 Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.642640 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfpfx" event={"ID":"72622a66-c6fb-4d32-af90-a7f10b0e1f94","Type":"ContainerDied","Data":"6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2"} Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.642671 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lfpfx" event={"ID":"72622a66-c6fb-4d32-af90-a7f10b0e1f94","Type":"ContainerDied","Data":"e8de3a5dd1e1d3afd059f7c813227b9ba0c7860475a180b3a0ba9bd2c2f92676"} Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.642693 4749 scope.go:117] "RemoveContainer" containerID="6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.642745 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lfpfx" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.644893 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72622a66-c6fb-4d32-af90-a7f10b0e1f94" (UID: "72622a66-c6fb-4d32-af90-a7f10b0e1f94"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.675536 4749 scope.go:117] "RemoveContainer" containerID="7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.713826 4749 scope.go:117] "RemoveContainer" containerID="98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.728035 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72622a66-c6fb-4d32-af90-a7f10b0e1f94-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.751282 4749 scope.go:117] "RemoveContainer" containerID="6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2" Oct 01 14:45:34 crc kubenswrapper[4749]: E1001 14:45:34.751661 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2\": container with ID starting with 6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2 not found: ID does not exist" containerID="6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.751692 4749 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2"} err="failed to get container status \"6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2\": rpc error: code = NotFound desc = could not find container \"6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2\": container with ID starting with 6664e2474db6c68348f94bcc68803360bbb74520fd31ef2dde00b96c3aaf2de2 not found: ID does not exist" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.751713 4749 scope.go:117] "RemoveContainer" containerID="7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d" Oct 01 14:45:34 crc kubenswrapper[4749]: E1001 14:45:34.751940 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d\": container with ID starting with 7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d not found: ID does not exist" containerID="7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.751964 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d"} err="failed to get container status \"7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d\": rpc error: code = NotFound desc = could not find container \"7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d\": container with ID starting with 7c4453cdc65cad9ac351256a5d6730b7e2b15d43596818828fb79b6b302e194d not found: ID does not exist" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.751978 4749 scope.go:117] "RemoveContainer" containerID="98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf" Oct 01 14:45:34 crc kubenswrapper[4749]: E1001 14:45:34.752206 4749 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf\": container with ID starting with 98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf not found: ID does not exist" containerID="98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.752236 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf"} err="failed to get container status \"98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf\": rpc error: code = NotFound desc = could not find container \"98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf\": container with ID starting with 98dc78976df10c5888286e1107f4cd5390b24ac5c7bba9dda3e11940aec548bf not found: ID does not exist" Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.981947 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lfpfx"] Oct 01 14:45:34 crc kubenswrapper[4749]: I1001 14:45:34.989670 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lfpfx"] Oct 01 14:45:35 crc kubenswrapper[4749]: I1001 14:45:35.258727 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" path="/var/lib/kubelet/pods/72622a66-c6fb-4d32-af90-a7f10b0e1f94/volumes" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.253553 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwsbg"] Oct 01 14:47:11 crc kubenswrapper[4749]: E1001 14:47:11.255515 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerName="extract-utilities" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.255605 4749 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerName="extract-utilities" Oct 01 14:47:11 crc kubenswrapper[4749]: E1001 14:47:11.255668 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerName="registry-server" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.255721 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerName="registry-server" Oct 01 14:47:11 crc kubenswrapper[4749]: E1001 14:47:11.255781 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerName="extract-content" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.255831 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerName="extract-content" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.256111 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="72622a66-c6fb-4d32-af90-a7f10b0e1f94" containerName="registry-server" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.257508 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwsbg"] Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.257698 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.414184 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-catalog-content\") pod \"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.414347 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcldh\" (UniqueName: \"kubernetes.io/projected/d38ca57b-5abd-415b-af85-523537a8ad49-kube-api-access-qcldh\") pod \"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.414396 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-utilities\") pod \"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.515949 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcldh\" (UniqueName: \"kubernetes.io/projected/d38ca57b-5abd-415b-af85-523537a8ad49-kube-api-access-qcldh\") pod \"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.516353 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-utilities\") pod 
\"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.516482 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-catalog-content\") pod \"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.517110 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-catalog-content\") pod \"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.517113 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-utilities\") pod \"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.546179 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcldh\" (UniqueName: \"kubernetes.io/projected/d38ca57b-5abd-415b-af85-523537a8ad49-kube-api-access-qcldh\") pod \"certified-operators-mwsbg\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.589075 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.685288 4749 scope.go:117] "RemoveContainer" containerID="cb3b464281fa1657283409f5cf05e8cfdc8e2623fcbd79af2a764289b7db20b3" Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.822568 4749 generic.go:334] "Generic (PLEG): container finished" podID="c3367109-f7d7-4296-9717-853f4700b93a" containerID="8fcb2d37fdc00caecad3ba3efbd05b2056b53f09aaedab1758199d95a68b69ac" exitCode=0 Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.822611 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" event={"ID":"c3367109-f7d7-4296-9717-853f4700b93a","Type":"ContainerDied","Data":"8fcb2d37fdc00caecad3ba3efbd05b2056b53f09aaedab1758199d95a68b69ac"} Oct 01 14:47:11 crc kubenswrapper[4749]: I1001 14:47:11.823292 4749 scope.go:117] "RemoveContainer" containerID="8fcb2d37fdc00caecad3ba3efbd05b2056b53f09aaedab1758199d95a68b69ac" Oct 01 14:47:12 crc kubenswrapper[4749]: I1001 14:47:12.135680 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwsbg"] Oct 01 14:47:12 crc kubenswrapper[4749]: I1001 14:47:12.448998 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lqrjw_must-gather-xdlwh_c3367109-f7d7-4296-9717-853f4700b93a/gather/0.log" Oct 01 14:47:12 crc kubenswrapper[4749]: I1001 14:47:12.840014 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwsbg" event={"ID":"d38ca57b-5abd-415b-af85-523537a8ad49","Type":"ContainerDied","Data":"828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb"} Oct 01 14:47:12 crc kubenswrapper[4749]: I1001 14:47:12.839865 4749 generic.go:334] "Generic (PLEG): container finished" podID="d38ca57b-5abd-415b-af85-523537a8ad49" containerID="828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb" exitCode=0 Oct 01 
14:47:12 crc kubenswrapper[4749]: I1001 14:47:12.841849 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwsbg" event={"ID":"d38ca57b-5abd-415b-af85-523537a8ad49","Type":"ContainerStarted","Data":"8397ff7644847687ba652a1cf597ff5991e7bab3019e31e976a835cbb798f4d5"} Oct 01 14:47:14 crc kubenswrapper[4749]: I1001 14:47:14.876163 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwsbg" event={"ID":"d38ca57b-5abd-415b-af85-523537a8ad49","Type":"ContainerStarted","Data":"3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b"} Oct 01 14:47:15 crc kubenswrapper[4749]: I1001 14:47:15.891116 4749 generic.go:334] "Generic (PLEG): container finished" podID="d38ca57b-5abd-415b-af85-523537a8ad49" containerID="3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b" exitCode=0 Oct 01 14:47:15 crc kubenswrapper[4749]: I1001 14:47:15.891463 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwsbg" event={"ID":"d38ca57b-5abd-415b-af85-523537a8ad49","Type":"ContainerDied","Data":"3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b"} Oct 01 14:47:16 crc kubenswrapper[4749]: I1001 14:47:16.926814 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwsbg" event={"ID":"d38ca57b-5abd-415b-af85-523537a8ad49","Type":"ContainerStarted","Data":"3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e"} Oct 01 14:47:16 crc kubenswrapper[4749]: I1001 14:47:16.948829 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwsbg" podStartSLOduration=2.420924862 podStartE2EDuration="5.948810133s" podCreationTimestamp="2025-10-01 14:47:11 +0000 UTC" firstStartedPulling="2025-10-01 14:47:12.843000982 +0000 UTC m=+6092.896985891" lastFinishedPulling="2025-10-01 14:47:16.370886253 +0000 UTC 
m=+6096.424871162" observedRunningTime="2025-10-01 14:47:16.941893683 +0000 UTC m=+6096.995878582" watchObservedRunningTime="2025-10-01 14:47:16.948810133 +0000 UTC m=+6097.002795032" Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.590160 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.590819 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.611405 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lqrjw/must-gather-xdlwh"] Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.611953 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" podUID="c3367109-f7d7-4296-9717-853f4700b93a" containerName="copy" containerID="cri-o://7480aed2048c05b8a28a51b2ee3358c7a52ed2d7b02491e36218a3cfdd038cc0" gracePeriod=2 Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.632558 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lqrjw/must-gather-xdlwh"] Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.659321 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.989858 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lqrjw_must-gather-xdlwh_c3367109-f7d7-4296-9717-853f4700b93a/copy/0.log" Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.990610 4749 generic.go:334] "Generic (PLEG): container finished" podID="c3367109-f7d7-4296-9717-853f4700b93a" containerID="7480aed2048c05b8a28a51b2ee3358c7a52ed2d7b02491e36218a3cfdd038cc0" exitCode=143 Oct 01 14:47:21 crc kubenswrapper[4749]: I1001 14:47:21.990679 
4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e3bcec0423d7ad36843ef6cd72157ae49702356b2f110ca18df948dfe543c55" Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.045345 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.088303 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lqrjw_must-gather-xdlwh_c3367109-f7d7-4296-9717-853f4700b93a/copy/0.log" Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.088732 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.121539 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwsbg"] Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.265645 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwv6g\" (UniqueName: \"kubernetes.io/projected/c3367109-f7d7-4296-9717-853f4700b93a-kube-api-access-cwv6g\") pod \"c3367109-f7d7-4296-9717-853f4700b93a\" (UID: \"c3367109-f7d7-4296-9717-853f4700b93a\") " Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.266013 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c3367109-f7d7-4296-9717-853f4700b93a-must-gather-output\") pod \"c3367109-f7d7-4296-9717-853f4700b93a\" (UID: \"c3367109-f7d7-4296-9717-853f4700b93a\") " Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.270921 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3367109-f7d7-4296-9717-853f4700b93a-kube-api-access-cwv6g" (OuterVolumeSpecName: "kube-api-access-cwv6g") pod "c3367109-f7d7-4296-9717-853f4700b93a" (UID: 
"c3367109-f7d7-4296-9717-853f4700b93a"). InnerVolumeSpecName "kube-api-access-cwv6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.367924 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwv6g\" (UniqueName: \"kubernetes.io/projected/c3367109-f7d7-4296-9717-853f4700b93a-kube-api-access-cwv6g\") on node \"crc\" DevicePath \"\"" Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.450695 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3367109-f7d7-4296-9717-853f4700b93a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c3367109-f7d7-4296-9717-853f4700b93a" (UID: "c3367109-f7d7-4296-9717-853f4700b93a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:47:22 crc kubenswrapper[4749]: I1001 14:47:22.470015 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c3367109-f7d7-4296-9717-853f4700b93a-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 14:47:23 crc kubenswrapper[4749]: I1001 14:47:23.003585 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqrjw/must-gather-xdlwh" Oct 01 14:47:23 crc kubenswrapper[4749]: I1001 14:47:23.244953 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3367109-f7d7-4296-9717-853f4700b93a" path="/var/lib/kubelet/pods/c3367109-f7d7-4296-9717-853f4700b93a/volumes" Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.011927 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwsbg" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" containerName="registry-server" containerID="cri-o://3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e" gracePeriod=2 Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.486536 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.610049 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-catalog-content\") pod \"d38ca57b-5abd-415b-af85-523537a8ad49\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.610127 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcldh\" (UniqueName: \"kubernetes.io/projected/d38ca57b-5abd-415b-af85-523537a8ad49-kube-api-access-qcldh\") pod \"d38ca57b-5abd-415b-af85-523537a8ad49\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.610192 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-utilities\") pod \"d38ca57b-5abd-415b-af85-523537a8ad49\" (UID: \"d38ca57b-5abd-415b-af85-523537a8ad49\") " Oct 01 14:47:24 crc 
kubenswrapper[4749]: I1001 14:47:24.611160 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-utilities" (OuterVolumeSpecName: "utilities") pod "d38ca57b-5abd-415b-af85-523537a8ad49" (UID: "d38ca57b-5abd-415b-af85-523537a8ad49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.611763 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.618176 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38ca57b-5abd-415b-af85-523537a8ad49-kube-api-access-qcldh" (OuterVolumeSpecName: "kube-api-access-qcldh") pod "d38ca57b-5abd-415b-af85-523537a8ad49" (UID: "d38ca57b-5abd-415b-af85-523537a8ad49"). InnerVolumeSpecName "kube-api-access-qcldh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.695835 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d38ca57b-5abd-415b-af85-523537a8ad49" (UID: "d38ca57b-5abd-415b-af85-523537a8ad49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.714280 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d38ca57b-5abd-415b-af85-523537a8ad49-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:47:24 crc kubenswrapper[4749]: I1001 14:47:24.714321 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcldh\" (UniqueName: \"kubernetes.io/projected/d38ca57b-5abd-415b-af85-523537a8ad49-kube-api-access-qcldh\") on node \"crc\" DevicePath \"\"" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.050785 4749 generic.go:334] "Generic (PLEG): container finished" podID="d38ca57b-5abd-415b-af85-523537a8ad49" containerID="3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e" exitCode=0 Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.050840 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwsbg" event={"ID":"d38ca57b-5abd-415b-af85-523537a8ad49","Type":"ContainerDied","Data":"3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e"} Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.050885 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwsbg" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.050907 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwsbg" event={"ID":"d38ca57b-5abd-415b-af85-523537a8ad49","Type":"ContainerDied","Data":"8397ff7644847687ba652a1cf597ff5991e7bab3019e31e976a835cbb798f4d5"} Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.050930 4749 scope.go:117] "RemoveContainer" containerID="3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.101387 4749 scope.go:117] "RemoveContainer" containerID="3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.101927 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwsbg"] Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.120073 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwsbg"] Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.125511 4749 scope.go:117] "RemoveContainer" containerID="828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.175393 4749 scope.go:117] "RemoveContainer" containerID="3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e" Oct 01 14:47:25 crc kubenswrapper[4749]: E1001 14:47:25.175867 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e\": container with ID starting with 3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e not found: ID does not exist" containerID="3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.175906 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e"} err="failed to get container status \"3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e\": rpc error: code = NotFound desc = could not find container \"3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e\": container with ID starting with 3553dadff4cca3de3e8a02f95f97f508a485f890b537baae3ec02bb29432082e not found: ID does not exist" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.175929 4749 scope.go:117] "RemoveContainer" containerID="3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b" Oct 01 14:47:25 crc kubenswrapper[4749]: E1001 14:47:25.176475 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b\": container with ID starting with 3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b not found: ID does not exist" containerID="3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.176512 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b"} err="failed to get container status \"3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b\": rpc error: code = NotFound desc = could not find container \"3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b\": container with ID starting with 3b1befd960e1eb7119b85a1d2ce25e397e2645c0a63f335cb11f98524e02386b not found: ID does not exist" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.176543 4749 scope.go:117] "RemoveContainer" containerID="828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb" Oct 01 14:47:25 crc kubenswrapper[4749]: E1001 
14:47:25.176930 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb\": container with ID starting with 828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb not found: ID does not exist" containerID="828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.176957 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb"} err="failed to get container status \"828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb\": rpc error: code = NotFound desc = could not find container \"828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb\": container with ID starting with 828f8ddcc13572ae38b162eb1003e923acfc50654a964815683b09baa0301dbb not found: ID does not exist" Oct 01 14:47:25 crc kubenswrapper[4749]: I1001 14:47:25.242686 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" path="/var/lib/kubelet/pods/d38ca57b-5abd-415b-af85-523537a8ad49/volumes" Oct 01 14:47:32 crc kubenswrapper[4749]: I1001 14:47:32.106656 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:47:32 crc kubenswrapper[4749]: I1001 14:47:32.107099 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.991067 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qtqrp/must-gather-686kh"] Oct 01 14:47:47 crc kubenswrapper[4749]: E1001 14:47:47.992206 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" containerName="extract-content" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.992245 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" containerName="extract-content" Oct 01 14:47:47 crc kubenswrapper[4749]: E1001 14:47:47.992271 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" containerName="extract-utilities" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.992280 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" containerName="extract-utilities" Oct 01 14:47:47 crc kubenswrapper[4749]: E1001 14:47:47.992294 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" containerName="registry-server" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.992302 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" containerName="registry-server" Oct 01 14:47:47 crc kubenswrapper[4749]: E1001 14:47:47.992321 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3367109-f7d7-4296-9717-853f4700b93a" containerName="gather" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.992328 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3367109-f7d7-4296-9717-853f4700b93a" containerName="gather" Oct 01 14:47:47 crc kubenswrapper[4749]: E1001 14:47:47.992339 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3367109-f7d7-4296-9717-853f4700b93a" containerName="copy" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 
14:47:47.992347 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3367109-f7d7-4296-9717-853f4700b93a" containerName="copy" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.992577 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3367109-f7d7-4296-9717-853f4700b93a" containerName="gather" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.992599 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38ca57b-5abd-415b-af85-523537a8ad49" containerName="registry-server" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.992636 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3367109-f7d7-4296-9717-853f4700b93a" containerName="copy" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.994371 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.996946 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qtqrp"/"kube-root-ca.crt" Oct 01 14:47:47 crc kubenswrapper[4749]: I1001 14:47:47.998799 4749 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qtqrp"/"openshift-service-ca.crt" Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.011460 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qtqrp/must-gather-686kh"] Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.053105 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dphrq\" (UniqueName: \"kubernetes.io/projected/301b16e9-2377-4efe-a995-e13c0f737bf7-kube-api-access-dphrq\") pod \"must-gather-686kh\" (UID: \"301b16e9-2377-4efe-a995-e13c0f737bf7\") " pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.053465 4749 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/301b16e9-2377-4efe-a995-e13c0f737bf7-must-gather-output\") pod \"must-gather-686kh\" (UID: \"301b16e9-2377-4efe-a995-e13c0f737bf7\") " pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.155597 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dphrq\" (UniqueName: \"kubernetes.io/projected/301b16e9-2377-4efe-a995-e13c0f737bf7-kube-api-access-dphrq\") pod \"must-gather-686kh\" (UID: \"301b16e9-2377-4efe-a995-e13c0f737bf7\") " pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.155656 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/301b16e9-2377-4efe-a995-e13c0f737bf7-must-gather-output\") pod \"must-gather-686kh\" (UID: \"301b16e9-2377-4efe-a995-e13c0f737bf7\") " pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.156374 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/301b16e9-2377-4efe-a995-e13c0f737bf7-must-gather-output\") pod \"must-gather-686kh\" (UID: \"301b16e9-2377-4efe-a995-e13c0f737bf7\") " pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.181348 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dphrq\" (UniqueName: \"kubernetes.io/projected/301b16e9-2377-4efe-a995-e13c0f737bf7-kube-api-access-dphrq\") pod \"must-gather-686kh\" (UID: \"301b16e9-2377-4efe-a995-e13c0f737bf7\") " pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.318101 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:47:48 crc kubenswrapper[4749]: I1001 14:47:48.988967 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qtqrp/must-gather-686kh"] Oct 01 14:47:49 crc kubenswrapper[4749]: I1001 14:47:49.343724 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/must-gather-686kh" event={"ID":"301b16e9-2377-4efe-a995-e13c0f737bf7","Type":"ContainerStarted","Data":"ba92da98feafedc1820b672dfc18702f25800a44afc58cafc537fc7c87cc7083"} Oct 01 14:47:49 crc kubenswrapper[4749]: I1001 14:47:49.343783 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/must-gather-686kh" event={"ID":"301b16e9-2377-4efe-a995-e13c0f737bf7","Type":"ContainerStarted","Data":"beaa6cacb10b83f1155bc4a6b9b19be5f38efa5183e99bc338e7a18188e15642"} Oct 01 14:47:50 crc kubenswrapper[4749]: I1001 14:47:50.357105 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/must-gather-686kh" event={"ID":"301b16e9-2377-4efe-a995-e13c0f737bf7","Type":"ContainerStarted","Data":"d80978ee68b4c34c8b0c4cf05e341e825e11fe33bbc8249b0bdd00dc2b05de02"} Oct 01 14:47:50 crc kubenswrapper[4749]: I1001 14:47:50.376022 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qtqrp/must-gather-686kh" podStartSLOduration=3.375997587 podStartE2EDuration="3.375997587s" podCreationTimestamp="2025-10-01 14:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:47:50.369843549 +0000 UTC m=+6130.423828478" watchObservedRunningTime="2025-10-01 14:47:50.375997587 +0000 UTC m=+6130.429982516" Oct 01 14:47:52 crc kubenswrapper[4749]: I1001 14:47:52.780068 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-9xn72"] Oct 01 14:47:52 crc kubenswrapper[4749]: 
I1001 14:47:52.781626 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:47:52 crc kubenswrapper[4749]: I1001 14:47:52.783319 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qtqrp"/"default-dockercfg-s7hph" Oct 01 14:47:52 crc kubenswrapper[4749]: I1001 14:47:52.949310 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klm2r\" (UniqueName: \"kubernetes.io/projected/82613388-f5f4-47e3-900e-7295d5ec16b8-kube-api-access-klm2r\") pod \"crc-debug-9xn72\" (UID: \"82613388-f5f4-47e3-900e-7295d5ec16b8\") " pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:47:52 crc kubenswrapper[4749]: I1001 14:47:52.949375 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82613388-f5f4-47e3-900e-7295d5ec16b8-host\") pod \"crc-debug-9xn72\" (UID: \"82613388-f5f4-47e3-900e-7295d5ec16b8\") " pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:47:53 crc kubenswrapper[4749]: I1001 14:47:53.051259 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klm2r\" (UniqueName: \"kubernetes.io/projected/82613388-f5f4-47e3-900e-7295d5ec16b8-kube-api-access-klm2r\") pod \"crc-debug-9xn72\" (UID: \"82613388-f5f4-47e3-900e-7295d5ec16b8\") " pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:47:53 crc kubenswrapper[4749]: I1001 14:47:53.051565 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82613388-f5f4-47e3-900e-7295d5ec16b8-host\") pod \"crc-debug-9xn72\" (UID: \"82613388-f5f4-47e3-900e-7295d5ec16b8\") " pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:47:53 crc kubenswrapper[4749]: I1001 14:47:53.051627 4749 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82613388-f5f4-47e3-900e-7295d5ec16b8-host\") pod \"crc-debug-9xn72\" (UID: \"82613388-f5f4-47e3-900e-7295d5ec16b8\") " pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:47:53 crc kubenswrapper[4749]: I1001 14:47:53.070964 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klm2r\" (UniqueName: \"kubernetes.io/projected/82613388-f5f4-47e3-900e-7295d5ec16b8-kube-api-access-klm2r\") pod \"crc-debug-9xn72\" (UID: \"82613388-f5f4-47e3-900e-7295d5ec16b8\") " pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:47:53 crc kubenswrapper[4749]: I1001 14:47:53.103490 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:47:53 crc kubenswrapper[4749]: I1001 14:47:53.383788 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/crc-debug-9xn72" event={"ID":"82613388-f5f4-47e3-900e-7295d5ec16b8","Type":"ContainerStarted","Data":"93143623d1371c30113bfc5ecaaa1b18b33dd887f335a1b935dc5c4a4b4a80ff"} Oct 01 14:47:53 crc kubenswrapper[4749]: I1001 14:47:53.384534 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/crc-debug-9xn72" event={"ID":"82613388-f5f4-47e3-900e-7295d5ec16b8","Type":"ContainerStarted","Data":"4499b3af6b955b6244895c3fba7e938ab89971a55aed2dabbf96b46967ac6a99"} Oct 01 14:47:53 crc kubenswrapper[4749]: I1001 14:47:53.399077 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qtqrp/crc-debug-9xn72" podStartSLOduration=1.399058207 podStartE2EDuration="1.399058207s" podCreationTimestamp="2025-10-01 14:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:47:53.39813687 +0000 UTC m=+6133.452121769" watchObservedRunningTime="2025-10-01 14:47:53.399058207 
+0000 UTC m=+6133.453043106" Oct 01 14:48:02 crc kubenswrapper[4749]: I1001 14:48:02.106491 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:48:02 crc kubenswrapper[4749]: I1001 14:48:02.107061 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:48:11 crc kubenswrapper[4749]: I1001 14:48:11.812734 4749 scope.go:117] "RemoveContainer" containerID="7480aed2048c05b8a28a51b2ee3358c7a52ed2d7b02491e36218a3cfdd038cc0" Oct 01 14:48:11 crc kubenswrapper[4749]: I1001 14:48:11.848358 4749 scope.go:117] "RemoveContainer" containerID="8fcb2d37fdc00caecad3ba3efbd05b2056b53f09aaedab1758199d95a68b69ac" Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.106557 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.107104 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.107148 4749 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.107928 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4cf5263ca8bafbd9edc3554b2592b9589b7975d8a139c3dfc769d7079e44546"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.107997 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://d4cf5263ca8bafbd9edc3554b2592b9589b7975d8a139c3dfc769d7079e44546" gracePeriod=600 Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.793056 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="d4cf5263ca8bafbd9edc3554b2592b9589b7975d8a139c3dfc769d7079e44546" exitCode=0 Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.793112 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"d4cf5263ca8bafbd9edc3554b2592b9589b7975d8a139c3dfc769d7079e44546"} Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.793551 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerStarted","Data":"a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e"} Oct 01 14:48:32 crc kubenswrapper[4749]: I1001 14:48:32.793569 4749 scope.go:117] "RemoveContainer" 
containerID="7c43b6352ed12e231360621583b06a589df9b513f102b2123dee56ada745b7a5" Oct 01 14:49:11 crc kubenswrapper[4749]: I1001 14:49:11.986723 4749 scope.go:117] "RemoveContainer" containerID="e6eac4604964ac43b8e09d373b25d7e9ec8ee77689274a9400de0dce872874f9" Oct 01 14:49:13 crc kubenswrapper[4749]: I1001 14:49:13.548179 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-677dbd476-b92fx_0e51f972-9f26-4b6b-8213-9261797a1ee0/barbican-api/0.log" Oct 01 14:49:13 crc kubenswrapper[4749]: I1001 14:49:13.563428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-677dbd476-b92fx_0e51f972-9f26-4b6b-8213-9261797a1ee0/barbican-api-log/0.log" Oct 01 14:49:13 crc kubenswrapper[4749]: I1001 14:49:13.848354 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5fdf7b5778-vxx8p_a27e333f-57a3-4257-9e49-e03928cfa02d/barbican-keystone-listener/0.log" Oct 01 14:49:14 crc kubenswrapper[4749]: I1001 14:49:14.039465 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5fdf7b5778-vxx8p_a27e333f-57a3-4257-9e49-e03928cfa02d/barbican-keystone-listener-log/0.log" Oct 01 14:49:14 crc kubenswrapper[4749]: I1001 14:49:14.145463 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c6dfb5585-x78z5_c8b1d3a9-044c-475f-b86f-7e099e2b1197/barbican-worker/0.log" Oct 01 14:49:14 crc kubenswrapper[4749]: I1001 14:49:14.275131 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5c6dfb5585-x78z5_c8b1d3a9-044c-475f-b86f-7e099e2b1197/barbican-worker-log/0.log" Oct 01 14:49:14 crc kubenswrapper[4749]: I1001 14:49:14.410272 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zb56w_36105d3f-3305-4cd8-9b9c-4b3d7eaec504/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:14 crc kubenswrapper[4749]: 
I1001 14:49:14.663740 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8443de13-c4a9-420c-a4ff-5aa54d222850/ceilometer-central-agent/0.log" Oct 01 14:49:14 crc kubenswrapper[4749]: I1001 14:49:14.683144 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8443de13-c4a9-420c-a4ff-5aa54d222850/ceilometer-notification-agent/0.log" Oct 01 14:49:14 crc kubenswrapper[4749]: I1001 14:49:14.781092 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8443de13-c4a9-420c-a4ff-5aa54d222850/proxy-httpd/0.log" Oct 01 14:49:14 crc kubenswrapper[4749]: I1001 14:49:14.882031 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_8443de13-c4a9-420c-a4ff-5aa54d222850/sg-core/0.log" Oct 01 14:49:15 crc kubenswrapper[4749]: I1001 14:49:15.115733 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_468beb78-1358-4a1b-ad2c-3941f3f270c6/cinder-api-log/0.log" Oct 01 14:49:15 crc kubenswrapper[4749]: I1001 14:49:15.326996 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_468beb78-1358-4a1b-ad2c-3941f3f270c6/cinder-api/0.log" Oct 01 14:49:15 crc kubenswrapper[4749]: I1001 14:49:15.421450 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_005bd9d5-4799-4763-aa6b-46a9341c36d2/cinder-scheduler/0.log" Oct 01 14:49:15 crc kubenswrapper[4749]: I1001 14:49:15.509693 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_005bd9d5-4799-4763-aa6b-46a9341c36d2/probe/0.log" Oct 01 14:49:15 crc kubenswrapper[4749]: I1001 14:49:15.658248 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cs8kx_f382a017-d5fe-45d9-ad7b-f9316dbd5834/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:15 crc kubenswrapper[4749]: I1001 
14:49:15.792778 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-n5r4t_45b8016b-ecf1-4187-98eb-daf846021c8c/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:15 crc kubenswrapper[4749]: I1001 14:49:15.957215 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-q4scf_753c7ac8-1ca7-4787-af3b-87553f59bc9f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:16 crc kubenswrapper[4749]: I1001 14:49:16.113132 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6ff49d554c-jx4l4_1c268c09-ae8f-49b8-916f-b5ce032bfaf1/init/0.log" Oct 01 14:49:16 crc kubenswrapper[4749]: I1001 14:49:16.271678 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6ff49d554c-jx4l4_1c268c09-ae8f-49b8-916f-b5ce032bfaf1/init/0.log" Oct 01 14:49:16 crc kubenswrapper[4749]: I1001 14:49:16.451484 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6ff49d554c-jx4l4_1c268c09-ae8f-49b8-916f-b5ce032bfaf1/dnsmasq-dns/0.log" Oct 01 14:49:16 crc kubenswrapper[4749]: I1001 14:49:16.637015 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nkvkn_505c57d6-8e3e-469e-b7ed-c15bdff56519/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:16 crc kubenswrapper[4749]: I1001 14:49:16.698901 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bc78bfd1-472f-4d64-b48c-7b986bee129a/glance-httpd/0.log" Oct 01 14:49:16 crc kubenswrapper[4749]: I1001 14:49:16.842172 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bc78bfd1-472f-4d64-b48c-7b986bee129a/glance-log/0.log" Oct 01 14:49:16 crc kubenswrapper[4749]: I1001 14:49:16.960507 4749 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_495b97d9-1d27-4e6e-a857-ee6cfdf6dffa/glance-httpd/0.log" Oct 01 14:49:17 crc kubenswrapper[4749]: I1001 14:49:17.090779 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_495b97d9-1d27-4e6e-a857-ee6cfdf6dffa/glance-log/0.log" Oct 01 14:49:17 crc kubenswrapper[4749]: I1001 14:49:17.305383 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b74b5b846-r84t7_e22321e2-ded2-4732-ac89-f9f0d4dcd199/horizon/0.log" Oct 01 14:49:17 crc kubenswrapper[4749]: I1001 14:49:17.455888 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-bpgp8_c51108bd-9132-43bf-ac9b-61a8284dc289/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:17 crc kubenswrapper[4749]: I1001 14:49:17.611832 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mmvsf_9ef2bd67-60d1-4f4b-893c-f7e22430addd/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:17 crc kubenswrapper[4749]: I1001 14:49:17.780510 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-b74b5b846-r84t7_e22321e2-ded2-4732-ac89-f9f0d4dcd199/horizon-log/0.log" Oct 01 14:49:17 crc kubenswrapper[4749]: I1001 14:49:17.984788 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322121-ksm5l_9f1c1f6f-c5a5-499c-874f-245d4d918274/keystone-cron/0.log" Oct 01 14:49:18 crc kubenswrapper[4749]: I1001 14:49:18.215447 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4b21eff6-e2ad-4c02-9558-0346ff822f46/kube-state-metrics/0.log" Oct 01 14:49:18 crc kubenswrapper[4749]: I1001 14:49:18.402916 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bksd6_f2871c6b-b170-4396-8c0b-be0ac02c1b48/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:18 crc kubenswrapper[4749]: I1001 14:49:18.415978 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-658b64fcb-k2w2c_5b73cc62-6695-480f-90cc-8d1f4b5993b3/keystone-api/0.log" Oct 01 14:49:18 crc kubenswrapper[4749]: I1001 14:49:18.972090 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59d7c7544c-n7mlp_680ec9d6-ccd3-4417-9919-7412600f23fb/neutron-httpd/0.log" Oct 01 14:49:19 crc kubenswrapper[4749]: I1001 14:49:19.065857 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59d7c7544c-n7mlp_680ec9d6-ccd3-4417-9919-7412600f23fb/neutron-api/0.log" Oct 01 14:49:19 crc kubenswrapper[4749]: I1001 14:49:19.113607 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gg528_9f01f729-fe3b-4f70-89c9-4398f80160e7/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:19 crc kubenswrapper[4749]: I1001 14:49:19.310010 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_6e8c8fb4-1126-4663-b4d5-f4a2c5a11f09/memcached/0.log" Oct 01 14:49:19 crc kubenswrapper[4749]: I1001 14:49:19.839501 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e5bf3ac0-8a0f-4c98-88ea-a55523c4b59c/nova-cell0-conductor-conductor/0.log" Oct 01 14:49:20 crc kubenswrapper[4749]: I1001 14:49:20.038247 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2fe9d22f-227f-4f5a-8c9c-fc50845af518/nova-cell1-conductor-conductor/0.log" Oct 01 14:49:20 crc kubenswrapper[4749]: I1001 14:49:20.448633 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b4293a98-bf1d-47e9-9c16-e272e6c836f7/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 14:49:20 crc kubenswrapper[4749]: I1001 14:49:20.694264 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_90086f57-f0d4-4a80-9606-d225410b66e2/nova-api-log/0.log" Oct 01 14:49:20 crc kubenswrapper[4749]: I1001 14:49:20.695718 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-n4plg_848e191d-2e82-41af-8368-7c9c7e7b200e/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:20 crc kubenswrapper[4749]: I1001 14:49:20.985500 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2c53c696-a24d-4024-86dc-2ce22e1a2e8e/nova-metadata-log/0.log" Oct 01 14:49:21 crc kubenswrapper[4749]: I1001 14:49:21.025103 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_90086f57-f0d4-4a80-9606-d225410b66e2/nova-api-api/0.log" Oct 01 14:49:21 crc kubenswrapper[4749]: I1001 14:49:21.307104 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b75b73e1-aac4-41c8-9ad7-afe216cf9741/mysql-bootstrap/0.log" Oct 01 14:49:21 crc kubenswrapper[4749]: I1001 14:49:21.462812 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_1097258f-f21a-4b28-935a-d7dea1d508dd/nova-scheduler-scheduler/0.log" Oct 01 14:49:21 crc kubenswrapper[4749]: I1001 14:49:21.515495 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b75b73e1-aac4-41c8-9ad7-afe216cf9741/mysql-bootstrap/0.log" Oct 01 14:49:21 crc kubenswrapper[4749]: I1001 14:49:21.525761 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_b75b73e1-aac4-41c8-9ad7-afe216cf9741/galera/0.log" Oct 01 14:49:21 crc kubenswrapper[4749]: I1001 14:49:21.947869 4749 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_215f6c11-74c0-4e5e-a39d-8af23dd5e4af/mysql-bootstrap/0.log" Oct 01 14:49:22 crc kubenswrapper[4749]: I1001 14:49:22.060540 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_215f6c11-74c0-4e5e-a39d-8af23dd5e4af/mysql-bootstrap/0.log" Oct 01 14:49:22 crc kubenswrapper[4749]: I1001 14:49:22.147397 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_215f6c11-74c0-4e5e-a39d-8af23dd5e4af/galera/0.log" Oct 01 14:49:22 crc kubenswrapper[4749]: I1001 14:49:22.304069 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_b55ebe69-1518-428b-9ceb-383de60316cc/openstackclient/0.log" Oct 01 14:49:22 crc kubenswrapper[4749]: I1001 14:49:22.402359 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-jfv7g_2f80f3fb-dca2-4aaa-a29e-8a7126b5bfca/openstack-network-exporter/0.log" Oct 01 14:49:22 crc kubenswrapper[4749]: I1001 14:49:22.594268 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-75t94_45d86d96-b332-498e-a952-c34007c2f07b/ovsdb-server-init/0.log" Oct 01 14:49:22 crc kubenswrapper[4749]: I1001 14:49:22.595944 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2c53c696-a24d-4024-86dc-2ce22e1a2e8e/nova-metadata-metadata/0.log" Oct 01 14:49:22 crc kubenswrapper[4749]: I1001 14:49:22.876761 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-75t94_45d86d96-b332-498e-a952-c34007c2f07b/ovsdb-server/0.log" Oct 01 14:49:22 crc kubenswrapper[4749]: I1001 14:49:22.889675 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-75t94_45d86d96-b332-498e-a952-c34007c2f07b/ovsdb-server-init/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.005412 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-75t94_45d86d96-b332-498e-a952-c34007c2f07b/ovs-vswitchd/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.054968 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-sl4xv_4035d0d3-eeec-429f-b31e-ab4649ecf92a/ovn-controller/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.114659 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-x8tkd_4b848b5a-f3c5-438c-a481-f06d07d4273a/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.308656 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_78021fea-966d-45d2-8816-265437360e8f/openstack-network-exporter/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.308783 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_78021fea-966d-45d2-8816-265437360e8f/ovn-northd/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.436846 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eb007c1f-6b53-4c8a-9921-85ccd3d5dad5/openstack-network-exporter/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.500845 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eb007c1f-6b53-4c8a-9921-85ccd3d5dad5/ovsdbserver-nb/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.600637 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_854288a3-bb59-4721-b1ac-059920cd8c30/openstack-network-exporter/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.632505 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_854288a3-bb59-4721-b1ac-059920cd8c30/ovsdbserver-sb/0.log" Oct 01 14:49:23 crc kubenswrapper[4749]: I1001 14:49:23.805469 4749 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_placement-ffbb6dc5b-8kwbn_5d400dce-67f7-4e74-b2b9-85f0302a3e43/placement-api/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.010550 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/init-config-reloader/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.013301 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-ffbb6dc5b-8kwbn_5d400dce-67f7-4e74-b2b9-85f0302a3e43/placement-log/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.151899 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/init-config-reloader/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.195116 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/prometheus/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.204696 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/config-reloader/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.223270 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6877d0fa-8236-4975-af20-88d438464469/thanos-sidecar/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.391762 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7/setup-container/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.589014 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7/setup-container/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.618510 4749 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_811135e0-fdbb-4e6e-bd9f-13d54ba7f4f7/rabbitmq/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.645932 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1d655140-d63d-4e40-8de1-875213f37d4a/setup-container/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.826967 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1d655140-d63d-4e40-8de1-875213f37d4a/setup-container/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.877333 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_1d655140-d63d-4e40-8de1-875213f37d4a/rabbitmq/0.log" Oct 01 14:49:24 crc kubenswrapper[4749]: I1001 14:49:24.946013 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b5aa915-bf5a-4046-834c-6051ed420f42/setup-container/0.log" Oct 01 14:49:25 crc kubenswrapper[4749]: I1001 14:49:25.170612 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b5aa915-bf5a-4046-834c-6051ed420f42/setup-container/0.log" Oct 01 14:49:25 crc kubenswrapper[4749]: I1001 14:49:25.184876 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-c75vt_9b4bd5b0-38c2-416c-aba3-9a0522807502/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:25 crc kubenswrapper[4749]: I1001 14:49:25.191675 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5b5aa915-bf5a-4046-834c-6051ed420f42/rabbitmq/0.log" Oct 01 14:49:25 crc kubenswrapper[4749]: I1001 14:49:25.564362 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zl4wx_ae065bff-2fba-4e8b-a734-75cd8b9d1a26/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:25 crc kubenswrapper[4749]: I1001 14:49:25.604312 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-n659n_ef750054-fd5c-408e-bd33-90e1a43d8a86/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:25 crc kubenswrapper[4749]: I1001 14:49:25.790141 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r4gm5_99ed982c-1039-47b4-b8f8-fcc9d06e636d/ssh-known-hosts-edpm-deployment/0.log" Oct 01 14:49:25 crc kubenswrapper[4749]: I1001 14:49:25.794936 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-s59r6_e51631e2-8bb9-4f43-958a-a3475d800d61/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:25 crc kubenswrapper[4749]: I1001 14:49:25.979464 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85b88cb8b7-2gzx9_6139ffc4-c70f-45d5-aa79-6fc7b79f2034/proxy-server/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.144197 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-85b88cb8b7-2gzx9_6139ffc4-c70f-45d5-aa79-6fc7b79f2034/proxy-httpd/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.281317 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-h6gm9_c0804583-6f4e-48e5-99f5-eaee2844191d/swift-ring-rebalance/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.285079 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/account-auditor/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.359166 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/account-reaper/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.510852 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/account-server/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.518766 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/account-replicator/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.561011 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/container-auditor/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.610357 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/container-replicator/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.688407 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/container-server/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.761606 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/container-updater/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.771419 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-auditor/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.806067 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-expirer/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.911930 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-replicator/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.948835 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-server/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.964894 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/object-updater/0.log" Oct 01 14:49:26 crc kubenswrapper[4749]: I1001 14:49:26.996280 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/rsync/0.log" Oct 01 14:49:27 crc kubenswrapper[4749]: I1001 14:49:27.084779 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_6ef9f56d-2299-424f-9cc3-21cd7fcae8c1/swift-recon-cron/0.log" Oct 01 14:49:27 crc kubenswrapper[4749]: I1001 14:49:27.188847 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-2d7x6_a74a77b0-6409-400a-a75c-115e2b2cba85/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:27 crc kubenswrapper[4749]: I1001 14:49:27.321262 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_fbe56db7-126d-4ce2-a7e6-ebd281fb8b6d/tempest-tests-tempest-tests-runner/0.log" Oct 01 14:49:27 crc kubenswrapper[4749]: I1001 14:49:27.346157 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1390b811-1714-4e90-9491-e38ba4b0a530/test-operator-logs-container/0.log" Oct 01 14:49:27 crc kubenswrapper[4749]: I1001 14:49:27.521995 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-w88x7_36d63e13-8131-47f4-a65a-a78db593d3bf/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 14:49:28 crc kubenswrapper[4749]: I1001 14:49:28.350232 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_60860f07-ba03-4dfb-bb91-2bd68232bc90/watcher-applier/0.log" Oct 01 14:49:28 crc kubenswrapper[4749]: I1001 14:49:28.607350 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_3f0334e5-add1-4ced-bad4-7e77d528e28a/watcher-decision-engine/0.log" Oct 01 14:49:28 crc kubenswrapper[4749]: I1001 14:49:28.854498 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_696adfa9-0326-4d60-8a2d-c53ee267a249/watcher-api-log/0.log" Oct 01 14:49:31 crc kubenswrapper[4749]: I1001 14:49:31.216372 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_3f0334e5-add1-4ced-bad4-7e77d528e28a/watcher-decision-engine/1.log" Oct 01 14:49:32 crc kubenswrapper[4749]: I1001 14:49:32.021779 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_696adfa9-0326-4d60-8a2d-c53ee267a249/watcher-api/0.log" Oct 01 14:49:52 crc kubenswrapper[4749]: I1001 14:49:52.631139 4749 generic.go:334] "Generic (PLEG): container finished" podID="82613388-f5f4-47e3-900e-7295d5ec16b8" containerID="93143623d1371c30113bfc5ecaaa1b18b33dd887f335a1b935dc5c4a4b4a80ff" exitCode=0 Oct 01 14:49:52 crc kubenswrapper[4749]: I1001 14:49:52.631278 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/crc-debug-9xn72" event={"ID":"82613388-f5f4-47e3-900e-7295d5ec16b8","Type":"ContainerDied","Data":"93143623d1371c30113bfc5ecaaa1b18b33dd887f335a1b935dc5c4a4b4a80ff"} Oct 01 14:49:53 crc kubenswrapper[4749]: I1001 14:49:53.808090 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:49:53 crc kubenswrapper[4749]: I1001 14:49:53.849164 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-9xn72"] Oct 01 14:49:53 crc kubenswrapper[4749]: I1001 14:49:53.863832 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-9xn72"] Oct 01 14:49:53 crc kubenswrapper[4749]: I1001 14:49:53.946229 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klm2r\" (UniqueName: \"kubernetes.io/projected/82613388-f5f4-47e3-900e-7295d5ec16b8-kube-api-access-klm2r\") pod \"82613388-f5f4-47e3-900e-7295d5ec16b8\" (UID: \"82613388-f5f4-47e3-900e-7295d5ec16b8\") " Oct 01 14:49:53 crc kubenswrapper[4749]: I1001 14:49:53.946652 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82613388-f5f4-47e3-900e-7295d5ec16b8-host\") pod \"82613388-f5f4-47e3-900e-7295d5ec16b8\" (UID: \"82613388-f5f4-47e3-900e-7295d5ec16b8\") " Oct 01 14:49:53 crc kubenswrapper[4749]: I1001 14:49:53.946777 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82613388-f5f4-47e3-900e-7295d5ec16b8-host" (OuterVolumeSpecName: "host") pod "82613388-f5f4-47e3-900e-7295d5ec16b8" (UID: "82613388-f5f4-47e3-900e-7295d5ec16b8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:49:53 crc kubenswrapper[4749]: I1001 14:49:53.947348 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82613388-f5f4-47e3-900e-7295d5ec16b8-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:49:53 crc kubenswrapper[4749]: I1001 14:49:53.951567 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82613388-f5f4-47e3-900e-7295d5ec16b8-kube-api-access-klm2r" (OuterVolumeSpecName: "kube-api-access-klm2r") pod "82613388-f5f4-47e3-900e-7295d5ec16b8" (UID: "82613388-f5f4-47e3-900e-7295d5ec16b8"). InnerVolumeSpecName "kube-api-access-klm2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:49:54 crc kubenswrapper[4749]: I1001 14:49:54.049660 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klm2r\" (UniqueName: \"kubernetes.io/projected/82613388-f5f4-47e3-900e-7295d5ec16b8-kube-api-access-klm2r\") on node \"crc\" DevicePath \"\"" Oct 01 14:49:54 crc kubenswrapper[4749]: I1001 14:49:54.655407 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4499b3af6b955b6244895c3fba7e938ab89971a55aed2dabbf96b46967ac6a99" Oct 01 14:49:54 crc kubenswrapper[4749]: I1001 14:49:54.655482 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-9xn72" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.049143 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-schtk"] Oct 01 14:49:55 crc kubenswrapper[4749]: E1001 14:49:55.050779 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82613388-f5f4-47e3-900e-7295d5ec16b8" containerName="container-00" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.050949 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="82613388-f5f4-47e3-900e-7295d5ec16b8" containerName="container-00" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.051201 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="82613388-f5f4-47e3-900e-7295d5ec16b8" containerName="container-00" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.051971 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.055677 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qtqrp"/"default-dockercfg-s7hph" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.171847 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvzhm\" (UniqueName: \"kubernetes.io/projected/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-kube-api-access-vvzhm\") pod \"crc-debug-schtk\" (UID: \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\") " pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.172151 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-host\") pod \"crc-debug-schtk\" (UID: \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\") " 
pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.250987 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82613388-f5f4-47e3-900e-7295d5ec16b8" path="/var/lib/kubelet/pods/82613388-f5f4-47e3-900e-7295d5ec16b8/volumes" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.274733 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-host\") pod \"crc-debug-schtk\" (UID: \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\") " pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.274895 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-host\") pod \"crc-debug-schtk\" (UID: \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\") " pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.274956 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzhm\" (UniqueName: \"kubernetes.io/projected/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-kube-api-access-vvzhm\") pod \"crc-debug-schtk\" (UID: \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\") " pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.295323 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzhm\" (UniqueName: \"kubernetes.io/projected/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-kube-api-access-vvzhm\") pod \"crc-debug-schtk\" (UID: \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\") " pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.370179 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:55 crc kubenswrapper[4749]: I1001 14:49:55.683916 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/crc-debug-schtk" event={"ID":"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c","Type":"ContainerStarted","Data":"6b951e31f3bd0572172d46bc761136da22e04586a7b9bd86a4acc8a50e7cba31"} Oct 01 14:49:56 crc kubenswrapper[4749]: I1001 14:49:56.697807 4749 generic.go:334] "Generic (PLEG): container finished" podID="17398fb0-1c9c-45b4-80fc-fc75ecbfa92c" containerID="918c375ba705412d84922f5e10caf2af72f3cf699f9bd2382064529311ad19d2" exitCode=0 Oct 01 14:49:56 crc kubenswrapper[4749]: I1001 14:49:56.697861 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/crc-debug-schtk" event={"ID":"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c","Type":"ContainerDied","Data":"918c375ba705412d84922f5e10caf2af72f3cf699f9bd2382064529311ad19d2"} Oct 01 14:49:57 crc kubenswrapper[4749]: I1001 14:49:57.812812 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:49:57 crc kubenswrapper[4749]: I1001 14:49:57.925734 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvzhm\" (UniqueName: \"kubernetes.io/projected/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-kube-api-access-vvzhm\") pod \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\" (UID: \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\") " Oct 01 14:49:57 crc kubenswrapper[4749]: I1001 14:49:57.925845 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-host\") pod \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\" (UID: \"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c\") " Oct 01 14:49:57 crc kubenswrapper[4749]: I1001 14:49:57.925938 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-host" (OuterVolumeSpecName: "host") pod "17398fb0-1c9c-45b4-80fc-fc75ecbfa92c" (UID: "17398fb0-1c9c-45b4-80fc-fc75ecbfa92c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:49:57 crc kubenswrapper[4749]: I1001 14:49:57.926508 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-host\") on node \"crc\" DevicePath \"\"" Oct 01 14:49:57 crc kubenswrapper[4749]: I1001 14:49:57.937733 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-kube-api-access-vvzhm" (OuterVolumeSpecName: "kube-api-access-vvzhm") pod "17398fb0-1c9c-45b4-80fc-fc75ecbfa92c" (UID: "17398fb0-1c9c-45b4-80fc-fc75ecbfa92c"). InnerVolumeSpecName "kube-api-access-vvzhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:49:58 crc kubenswrapper[4749]: I1001 14:49:58.027704 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvzhm\" (UniqueName: \"kubernetes.io/projected/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c-kube-api-access-vvzhm\") on node \"crc\" DevicePath \"\"" Oct 01 14:49:58 crc kubenswrapper[4749]: I1001 14:49:58.716952 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/crc-debug-schtk" event={"ID":"17398fb0-1c9c-45b4-80fc-fc75ecbfa92c","Type":"ContainerDied","Data":"6b951e31f3bd0572172d46bc761136da22e04586a7b9bd86a4acc8a50e7cba31"} Oct 01 14:49:58 crc kubenswrapper[4749]: I1001 14:49:58.716999 4749 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b951e31f3bd0572172d46bc761136da22e04586a7b9bd86a4acc8a50e7cba31" Oct 01 14:49:58 crc kubenswrapper[4749]: I1001 14:49:58.717031 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-schtk" Oct 01 14:50:05 crc kubenswrapper[4749]: I1001 14:50:05.964024 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-schtk"] Oct 01 14:50:05 crc kubenswrapper[4749]: I1001 14:50:05.971641 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-schtk"] Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.179660 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-bfv4j"] Oct 01 14:50:07 crc kubenswrapper[4749]: E1001 14:50:07.180517 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17398fb0-1c9c-45b4-80fc-fc75ecbfa92c" containerName="container-00" Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.180529 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="17398fb0-1c9c-45b4-80fc-fc75ecbfa92c" containerName="container-00" Oct 01 14:50:07 crc 
kubenswrapper[4749]: I1001 14:50:07.180730 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="17398fb0-1c9c-45b4-80fc-fc75ecbfa92c" containerName="container-00"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.181384 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.183645 4749 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qtqrp"/"default-dockercfg-s7hph"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.258605 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17398fb0-1c9c-45b4-80fc-fc75ecbfa92c" path="/var/lib/kubelet/pods/17398fb0-1c9c-45b4-80fc-fc75ecbfa92c/volumes"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.296542 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65e76968-7010-4e3c-968e-f86d3aa08b21-host\") pod \"crc-debug-bfv4j\" (UID: \"65e76968-7010-4e3c-968e-f86d3aa08b21\") " pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.296870 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4cmb\" (UniqueName: \"kubernetes.io/projected/65e76968-7010-4e3c-968e-f86d3aa08b21-kube-api-access-b4cmb\") pod \"crc-debug-bfv4j\" (UID: \"65e76968-7010-4e3c-968e-f86d3aa08b21\") " pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.398569 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4cmb\" (UniqueName: \"kubernetes.io/projected/65e76968-7010-4e3c-968e-f86d3aa08b21-kube-api-access-b4cmb\") pod \"crc-debug-bfv4j\" (UID: \"65e76968-7010-4e3c-968e-f86d3aa08b21\") " pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.398745 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65e76968-7010-4e3c-968e-f86d3aa08b21-host\") pod \"crc-debug-bfv4j\" (UID: \"65e76968-7010-4e3c-968e-f86d3aa08b21\") " pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.398915 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65e76968-7010-4e3c-968e-f86d3aa08b21-host\") pod \"crc-debug-bfv4j\" (UID: \"65e76968-7010-4e3c-968e-f86d3aa08b21\") " pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.424866 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4cmb\" (UniqueName: \"kubernetes.io/projected/65e76968-7010-4e3c-968e-f86d3aa08b21-kube-api-access-b4cmb\") pod \"crc-debug-bfv4j\" (UID: \"65e76968-7010-4e3c-968e-f86d3aa08b21\") " pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.510257 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:07 crc kubenswrapper[4749]: I1001 14:50:07.801329 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/crc-debug-bfv4j" event={"ID":"65e76968-7010-4e3c-968e-f86d3aa08b21","Type":"ContainerStarted","Data":"1ca5fc7d3ba3d53d1d98a5e2db76dacf66a08c9cb688b15f4b998b81fcb21fbd"}
Oct 01 14:50:08 crc kubenswrapper[4749]: I1001 14:50:08.818546 4749 generic.go:334] "Generic (PLEG): container finished" podID="65e76968-7010-4e3c-968e-f86d3aa08b21" containerID="df26a7329a6762cd8812ac6b481472f35b933638b96e5a7f1f924d11a282cd09" exitCode=0
Oct 01 14:50:08 crc kubenswrapper[4749]: I1001 14:50:08.818670 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/crc-debug-bfv4j" event={"ID":"65e76968-7010-4e3c-968e-f86d3aa08b21","Type":"ContainerDied","Data":"df26a7329a6762cd8812ac6b481472f35b933638b96e5a7f1f924d11a282cd09"}
Oct 01 14:50:08 crc kubenswrapper[4749]: I1001 14:50:08.869844 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-bfv4j"]
Oct 01 14:50:08 crc kubenswrapper[4749]: I1001 14:50:08.881691 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qtqrp/crc-debug-bfv4j"]
Oct 01 14:50:09 crc kubenswrapper[4749]: I1001 14:50:09.954844 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.059657 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65e76968-7010-4e3c-968e-f86d3aa08b21-host\") pod \"65e76968-7010-4e3c-968e-f86d3aa08b21\" (UID: \"65e76968-7010-4e3c-968e-f86d3aa08b21\") "
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.060154 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4cmb\" (UniqueName: \"kubernetes.io/projected/65e76968-7010-4e3c-968e-f86d3aa08b21-kube-api-access-b4cmb\") pod \"65e76968-7010-4e3c-968e-f86d3aa08b21\" (UID: \"65e76968-7010-4e3c-968e-f86d3aa08b21\") "
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.059762 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65e76968-7010-4e3c-968e-f86d3aa08b21-host" (OuterVolumeSpecName: "host") pod "65e76968-7010-4e3c-968e-f86d3aa08b21" (UID: "65e76968-7010-4e3c-968e-f86d3aa08b21"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.065664 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e76968-7010-4e3c-968e-f86d3aa08b21-kube-api-access-b4cmb" (OuterVolumeSpecName: "kube-api-access-b4cmb") pod "65e76968-7010-4e3c-968e-f86d3aa08b21" (UID: "65e76968-7010-4e3c-968e-f86d3aa08b21"). InnerVolumeSpecName "kube-api-access-b4cmb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.162726 4749 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/65e76968-7010-4e3c-968e-f86d3aa08b21-host\") on node \"crc\" DevicePath \"\""
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.162764 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4cmb\" (UniqueName: \"kubernetes.io/projected/65e76968-7010-4e3c-968e-f86d3aa08b21-kube-api-access-b4cmb\") on node \"crc\" DevicePath \"\""
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.604908 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-vvxcs_99a6dbcc-0b05-4471-b2d4-acacf72f6ff0/kube-rbac-proxy/0.log"
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.610426 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-vvxcs_99a6dbcc-0b05-4471-b2d4-acacf72f6ff0/manager/0.log"
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.758262 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-58njn_4a7c1ef4-c125-445b-9f1e-b24ee27e2938/kube-rbac-proxy/0.log"
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.803796 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-58njn_4a7c1ef4-c125-445b-9f1e-b24ee27e2938/manager/0.log"
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.840199 4749 scope.go:117] "RemoveContainer" containerID="df26a7329a6762cd8812ac6b481472f35b933638b96e5a7f1f924d11a282cd09"
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.840359 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/crc-debug-bfv4j"
Oct 01 14:50:10 crc kubenswrapper[4749]: I1001 14:50:10.932063 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-ck4dv_0418f548-554a-4efc-8494-4edc9d56fc7f/kube-rbac-proxy/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:10.971075 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-ck4dv_0418f548-554a-4efc-8494-4edc9d56fc7f/manager/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.020312 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/util/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.193504 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/util/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.207005 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/pull/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.210787 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/pull/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.240626 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e76968-7010-4e3c-968e-f86d3aa08b21" path="/var/lib/kubelet/pods/65e76968-7010-4e3c-968e-f86d3aa08b21/volumes"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.391156 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/util/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.429505 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/extract/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.461412 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f5c9bb9626039ee642dc317bc974780ff848016b3a9f0bb36c07968cfeh5xmz_4ae96a8d-d561-4e2d-a16d-59dec09d2d98/pull/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.584967 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-6w5m5_6647f74c-8bbb-490d-ade7-8b2fb5469ddc/kube-rbac-proxy/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.675566 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-6w5m5_6647f74c-8bbb-490d-ade7-8b2fb5469ddc/manager/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.721351 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-h22cx_11409643-1cee-49c5-b3d0-fa1ec4cb1af0/kube-rbac-proxy/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.773128 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-h22cx_11409643-1cee-49c5-b3d0-fa1ec4cb1af0/manager/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.906506 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-4gdqf_02d25e12-ca12-40f3-bc21-2b5a55fdba5d/kube-rbac-proxy/0.log"
Oct 01 14:50:11 crc kubenswrapper[4749]: I1001 14:50:11.910323 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-4gdqf_02d25e12-ca12-40f3-bc21-2b5a55fdba5d/manager/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.065381 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-xk2zl_8f20ab81-68d3-4973-9336-d00440b811f9/kube-rbac-proxy/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.234926 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-bhn8n_40a29698-620f-45d7-b630-0cfe188dd09f/kube-rbac-proxy/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.262028 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-xk2zl_8f20ab81-68d3-4973-9336-d00440b811f9/manager/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.289401 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-bhn8n_40a29698-620f-45d7-b630-0cfe188dd09f/manager/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.410898 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-292hc_00534b7e-41c7-4935-8349-78aee327867e/kube-rbac-proxy/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.478253 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-292hc_00534b7e-41c7-4935-8349-78aee327867e/manager/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.546200 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-8z228_6a63cd00-64f1-42cf-8250-abc3dfc3a4ff/kube-rbac-proxy/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.584199 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-8z228_6a63cd00-64f1-42cf-8250-abc3dfc3a4ff/manager/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.701428 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-8898x_3bb59228-30b8-42af-b24c-dc50224fde04/kube-rbac-proxy/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.726545 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-8898x_3bb59228-30b8-42af-b24c-dc50224fde04/manager/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.839922 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-j4s7v_a26beb61-7189-40d0-9284-e58654887bbd/kube-rbac-proxy/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.914107 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-j4s7v_a26beb61-7189-40d0-9284-e58654887bbd/manager/0.log"
Oct 01 14:50:12 crc kubenswrapper[4749]: I1001 14:50:12.945459 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-jqvd5_bf6d3a96-3f74-44df-8e75-1865612d0303/kube-rbac-proxy/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.124459 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-jqvd5_bf6d3a96-3f74-44df-8e75-1865612d0303/manager/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.144724 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-cr9ww_ab9def29-b23e-4af8-828b-3c4151503a96/kube-rbac-proxy/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.188295 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-cr9ww_ab9def29-b23e-4af8-828b-3c4151503a96/manager/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.380210 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cg66wx_557ca73b-a87b-4d42-8d86-dfbd057ae1fd/manager/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.434323 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8cg66wx_557ca73b-a87b-4d42-8d86-dfbd057ae1fd/kube-rbac-proxy/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.538890 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5fc59ccd99-c2sc8_9aac26fb-f511-4491-a239-7c5f7ced5f43/kube-rbac-proxy/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.623840 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7445ccf7db-stmwq_b0603207-94a4-47e5-aff1-2572d337f429/kube-rbac-proxy/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.847287 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-dmmcd_855affad-2b74-41a1-89c8-d6eba2072bb7/registry-server/0.log"
Oct 01 14:50:13 crc kubenswrapper[4749]: I1001 14:50:13.885090 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7445ccf7db-stmwq_b0603207-94a4-47e5-aff1-2572d337f429/operator/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.119861 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-cwjsk_f7c97980-24c5-42e5-b60c-763bd31ad269/kube-rbac-proxy/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.157246 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-tzpfm_b544aac2-b3f7-453e-a05b-58f22b1b4fe1/kube-rbac-proxy/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.205996 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-cwjsk_f7c97980-24c5-42e5-b60c-763bd31ad269/manager/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.354004 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-tzpfm_b544aac2-b3f7-453e-a05b-58f22b1b4fe1/manager/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.430671 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-x26nl_33e04d57-8c80-4ed0-b05b-1edf290d476e/operator/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.601318 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-5qtjp_c81045dc-62f0-4ae6-9e05-e26cc8f90611/kube-rbac-proxy/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.621480 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-5qtjp_c81045dc-62f0-4ae6-9e05-e26cc8f90611/manager/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.757534 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-9s6hz_765c8918-abbc-47dd-8960-18292d54a9a0/kube-rbac-proxy/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.877367 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-bdz5p_acf0c68b-d009-4f2f-a21e-a2573842a063/kube-rbac-proxy/0.log"
Oct 01 14:50:14 crc kubenswrapper[4749]: I1001 14:50:14.981910 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5fc59ccd99-c2sc8_9aac26fb-f511-4491-a239-7c5f7ced5f43/manager/0.log"
Oct 01 14:50:15 crc kubenswrapper[4749]: I1001 14:50:15.018651 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-bdz5p_acf0c68b-d009-4f2f-a21e-a2573842a063/manager/0.log"
Oct 01 14:50:15 crc kubenswrapper[4749]: I1001 14:50:15.119739 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-56f5865b8b-4g6n9_1d38e134-a674-403f-9a4b-4ee8de1fe763/kube-rbac-proxy/0.log"
Oct 01 14:50:15 crc kubenswrapper[4749]: I1001 14:50:15.126972 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-9s6hz_765c8918-abbc-47dd-8960-18292d54a9a0/manager/0.log"
Oct 01 14:50:15 crc kubenswrapper[4749]: I1001 14:50:15.232964 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-56f5865b8b-4g6n9_1d38e134-a674-403f-9a4b-4ee8de1fe763/manager/0.log"
Oct 01 14:50:30 crc kubenswrapper[4749]: I1001 14:50:30.910273 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8twps_31175322-99a0-4224-82d9-ca63e5a241c8/control-plane-machine-set-operator/0.log"
Oct 01 14:50:31 crc kubenswrapper[4749]: I1001 14:50:31.068622 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nwwtl_78eab148-c7db-4caf-99dd-7576fdee2366/kube-rbac-proxy/0.log"
Oct 01 14:50:31 crc kubenswrapper[4749]: I1001 14:50:31.076029 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-nwwtl_78eab148-c7db-4caf-99dd-7576fdee2366/machine-api-operator/0.log"
Oct 01 14:50:32 crc kubenswrapper[4749]: I1001 14:50:32.106151 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:50:32 crc kubenswrapper[4749]: I1001 14:50:32.106202 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:50:42 crc kubenswrapper[4749]: I1001 14:50:42.787434 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-4mvl7_851207b0-7920-45e6-b27b-aeda659789b7/cert-manager-controller/0.log"
Oct 01 14:50:42 crc kubenswrapper[4749]: I1001 14:50:42.892353 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-8k42j_58b791af-6670-4e2f-8cd7-a55793e8d9ba/cert-manager-cainjector/0.log"
Oct 01 14:50:43 crc kubenswrapper[4749]: I1001 14:50:43.004750 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-dbrwz_ab88e6b2-d588-4c1d-8946-89b5fe7c47f1/cert-manager-webhook/0.log"
Oct 01 14:50:55 crc kubenswrapper[4749]: I1001 14:50:55.429963 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-64bz9_cf5a15f7-d043-4b90-828f-584b833d38e5/nmstate-console-plugin/0.log"
Oct 01 14:50:55 crc kubenswrapper[4749]: I1001 14:50:55.632541 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vd27x_19901e16-c93e-4806-a467-7af1e9ad9405/nmstate-handler/0.log"
Oct 01 14:50:55 crc kubenswrapper[4749]: I1001 14:50:55.683382 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-5nhb7_48ae603d-54fb-4c62-8c02-1e9d6034ca81/kube-rbac-proxy/0.log"
Oct 01 14:50:55 crc kubenswrapper[4749]: I1001 14:50:55.710108 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-5nhb7_48ae603d-54fb-4c62-8c02-1e9d6034ca81/nmstate-metrics/0.log"
Oct 01 14:50:55 crc kubenswrapper[4749]: I1001 14:50:55.822542 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-c7stb_a4382132-aa77-4918-8533-ea2d0cf18eba/nmstate-operator/0.log"
Oct 01 14:50:55 crc kubenswrapper[4749]: I1001 14:50:55.898555 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-tk6l4_2237dcc7-ae68-4298-87d0-44d81d96b3c5/nmstate-webhook/0.log"
Oct 01 14:51:02 crc kubenswrapper[4749]: I1001 14:51:02.105999 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 01 14:51:02 crc kubenswrapper[4749]: I1001 14:51:02.106282 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 01 14:51:09 crc kubenswrapper[4749]: I1001 14:51:09.922554 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-zrnd6_286d5dec-6b31-4235-ae91-705174a2aa4e/kube-rbac-proxy/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.131817 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-frr-files/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.158051 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-zrnd6_286d5dec-6b31-4235-ae91-705174a2aa4e/controller/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.315584 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-metrics/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.323519 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-reloader/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.332301 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-frr-files/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.355810 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-reloader/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.520944 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-frr-files/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.537125 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-metrics/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.537303 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-metrics/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.548195 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-reloader/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.713289 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-reloader/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.731192 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-frr-files/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.752805 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/controller/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.801162 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/cp-metrics/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.899155 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/frr-metrics/0.log"
Oct 01 14:51:10 crc kubenswrapper[4749]: I1001 14:51:10.992574 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/kube-rbac-proxy/0.log"
Oct 01 14:51:11 crc kubenswrapper[4749]: I1001 14:51:11.010294 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/kube-rbac-proxy-frr/0.log"
Oct 01 14:51:11 crc kubenswrapper[4749]: I1001 14:51:11.136483 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/reloader/0.log"
Oct 01 14:51:11 crc kubenswrapper[4749]: I1001 14:51:11.288110 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-g9dzs_a398c955-5f6f-4519-8ad2-77d151718daf/frr-k8s-webhook-server/0.log"
Oct 01 14:51:11 crc kubenswrapper[4749]: I1001 14:51:11.419337 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-87f8f4bcc-p49h8_3cf2530f-bd63-401b-992b-51f01a86598c/manager/0.log"
Oct 01 14:51:11 crc kubenswrapper[4749]: I1001 14:51:11.627534 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5b54b59d49-p8dqv_c2aa45a1-115c-47cc-9b5f-d1a79549a3a8/webhook-server/0.log"
Oct 01 14:51:11 crc kubenswrapper[4749]: I1001 14:51:11.750557 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nx44s_cf11208c-e4d3-4873-872a-9b6b168ff648/kube-rbac-proxy/0.log"
Oct 01 14:51:12 crc kubenswrapper[4749]: I1001 14:51:12.320838 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-nx44s_cf11208c-e4d3-4873-872a-9b6b168ff648/speaker/0.log"
Oct 01 14:51:12 crc kubenswrapper[4749]: I1001 14:51:12.582071 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-vhppt_aadc0057-6a04-44e2-97cb-9f9f2e554f6f/frr/0.log"
Oct 01 14:51:24 crc kubenswrapper[4749]: I1001 14:51:24.911978 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/util/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.129169 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/util/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.139776 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/pull/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.156764 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/pull/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.329369 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/pull/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.331587 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/util/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.366948 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc54wz9_8544261d-2187-42db-a0ce-11ff55d6bff7/extract/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.486804 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/util/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.648050 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/util/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.664309 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/pull/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.697459 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/pull/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.862475 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/util/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.863303 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/pull/0.log"
Oct 01 14:51:25 crc kubenswrapper[4749]: I1001 14:51:25.898426 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d5fs4c_2629ec5a-9fe4-4220-812b-d5c9597e5363/extract/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.055603 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-utilities/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.217094 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-utilities/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.244914 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-content/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.265900 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-content/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.494171 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-content/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.551853 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/extract-utilities/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.714939 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-utilities/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.880460 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-utilities/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.889966 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-content/0.log"
Oct 01 14:51:26 crc kubenswrapper[4749]: I1001 14:51:26.985937 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-content/0.log"
Oct 01 14:51:27 crc kubenswrapper[4749]: I1001 14:51:27.251034 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-utilities/0.log"
Oct 01 14:51:27 crc kubenswrapper[4749]: I1001 14:51:27.263685 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/extract-content/0.log"
Oct 01 14:51:27 crc kubenswrapper[4749]: I1001 14:51:27.462667 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z2gzr_6efccd9c-893a-4381-92aa-7e1e5053d7bd/registry-server/0.log"
Oct 01 14:51:27 crc kubenswrapper[4749]: I1001 14:51:27.470765 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/util/0.log"
Oct 01 14:51:27 crc kubenswrapper[4749]: I1001 14:51:27.769427 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/pull/0.log"
Oct 01 14:51:27 crc kubenswrapper[4749]: I1001 14:51:27.838867 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/util/0.log"
Oct 01 14:51:27 crc kubenswrapper[4749]: I1001 14:51:27.857002 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/pull/0.log"
Oct 01 14:51:27 crc kubenswrapper[4749]: I1001 14:51:27.992024 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/util/0.log"
Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.071979 4749 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-n8fnx_deb1f55e-fe85-4bc7-bf9a-b2272fcfb147/registry-server/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.073906 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/pull/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.140389 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96fgkzd_019c51ec-3989-4693-9e3c-01beb6bdff4a/extract/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.303425 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rc7nj_87ce5873-e490-470b-8324-be053c551acb/marketplace-operator/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.337856 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-utilities/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.495111 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-content/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.547723 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-utilities/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.548900 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-content/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.866945 4749 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-utilities/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.951053 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/extract-content/0.log" Oct 01 14:51:28 crc kubenswrapper[4749]: I1001 14:51:28.978117 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-utilities/0.log" Oct 01 14:51:29 crc kubenswrapper[4749]: I1001 14:51:29.081689 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6l9vz_55f7c317-f4e5-4bf7-8245-e9f5a2291a52/registry-server/0.log" Oct 01 14:51:29 crc kubenswrapper[4749]: I1001 14:51:29.154762 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-content/0.log" Oct 01 14:51:29 crc kubenswrapper[4749]: I1001 14:51:29.165508 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-utilities/0.log" Oct 01 14:51:29 crc kubenswrapper[4749]: I1001 14:51:29.183251 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-content/0.log" Oct 01 14:51:29 crc kubenswrapper[4749]: I1001 14:51:29.370025 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-content/0.log" Oct 01 14:51:29 crc kubenswrapper[4749]: I1001 14:51:29.378418 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/extract-utilities/0.log" Oct 
01 14:51:30 crc kubenswrapper[4749]: I1001 14:51:30.211359 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bhtpj_85551e82-da3b-4fc0-ad0b-39c8248062ed/registry-server/0.log" Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.106987 4749 patch_prober.go:28] interesting pod/machine-config-daemon-4tfdz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.107460 4749 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.107543 4749 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.108871 4749 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e"} pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.109003 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerName="machine-config-daemon" containerID="cri-o://a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" 
gracePeriod=600 Oct 01 14:51:32 crc kubenswrapper[4749]: E1001 14:51:32.250296 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.661082 4749 generic.go:334] "Generic (PLEG): container finished" podID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" exitCode=0 Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.661140 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" event={"ID":"c763aedc-e75b-471c-83d7-2c9a87da1aaf","Type":"ContainerDied","Data":"a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e"} Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.661178 4749 scope.go:117] "RemoveContainer" containerID="d4cf5263ca8bafbd9edc3554b2592b9589b7975d8a139c3dfc769d7079e44546" Oct 01 14:51:32 crc kubenswrapper[4749]: I1001 14:51:32.661790 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:51:32 crc kubenswrapper[4749]: E1001 14:51:32.662302 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:51:42 crc kubenswrapper[4749]: I1001 
14:51:42.163951 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-jmcm2_74ad4854-4091-4485-b4da-881846999f3b/prometheus-operator/0.log" Oct 01 14:51:42 crc kubenswrapper[4749]: I1001 14:51:42.324247 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-79fb9d97f-2sbm4_2393602c-447f-4166-8ce8-7cb58c8d5510/prometheus-operator-admission-webhook/0.log" Oct 01 14:51:42 crc kubenswrapper[4749]: I1001 14:51:42.354260 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-79fb9d97f-d49vn_f69075a9-9209-4d56-8111-2bdcd4dc52e6/prometheus-operator-admission-webhook/0.log" Oct 01 14:51:42 crc kubenswrapper[4749]: I1001 14:51:42.501373 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-v579p_698fb753-09d8-462d-a57d-95b1cb6bae9a/operator/0.log" Oct 01 14:51:42 crc kubenswrapper[4749]: I1001 14:51:42.552385 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-995fk_82f26847-56e4-48f0-b990-e2f4e8c9cfd6/perses-operator/0.log" Oct 01 14:51:48 crc kubenswrapper[4749]: I1001 14:51:48.230357 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:51:48 crc kubenswrapper[4749]: E1001 14:51:48.231803 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:51:55 crc kubenswrapper[4749]: E1001 14:51:55.820203 4749 
upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.220:36224->38.102.83.220:34693: read tcp 38.102.83.220:36224->38.102.83.220:34693: read: connection reset by peer Oct 01 14:52:00 crc kubenswrapper[4749]: I1001 14:52:00.230119 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:52:00 crc kubenswrapper[4749]: E1001 14:52:00.230847 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:52:11 crc kubenswrapper[4749]: I1001 14:52:11.239376 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:52:11 crc kubenswrapper[4749]: E1001 14:52:11.240242 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:52:22 crc kubenswrapper[4749]: I1001 14:52:22.230200 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:52:22 crc kubenswrapper[4749]: E1001 14:52:22.231464 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:52:34 crc kubenswrapper[4749]: I1001 14:52:34.230362 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:52:34 crc kubenswrapper[4749]: E1001 14:52:34.231199 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:52:48 crc kubenswrapper[4749]: I1001 14:52:48.232126 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:52:48 crc kubenswrapper[4749]: E1001 14:52:48.234697 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:53:02 crc kubenswrapper[4749]: I1001 14:53:02.231071 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:53:02 crc kubenswrapper[4749]: E1001 14:53:02.231953 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:53:17 crc kubenswrapper[4749]: I1001 14:53:17.229759 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:53:17 crc kubenswrapper[4749]: E1001 14:53:17.230609 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.323369 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4lbrd"] Oct 01 14:53:23 crc kubenswrapper[4749]: E1001 14:53:23.324494 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e76968-7010-4e3c-968e-f86d3aa08b21" containerName="container-00" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.324511 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e76968-7010-4e3c-968e-f86d3aa08b21" containerName="container-00" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.324802 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e76968-7010-4e3c-968e-f86d3aa08b21" containerName="container-00" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.326709 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.347005 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4lbrd"] Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.434756 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkc4k\" (UniqueName: \"kubernetes.io/projected/a7baad79-9c29-4ba8-b007-24d7656d25d9-kube-api-access-bkc4k\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.434809 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-catalog-content\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.435086 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-utilities\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.536821 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkc4k\" (UniqueName: \"kubernetes.io/projected/a7baad79-9c29-4ba8-b007-24d7656d25d9-kube-api-access-bkc4k\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.536881 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-catalog-content\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.536965 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-utilities\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.537628 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-utilities\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.537782 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-catalog-content\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.562138 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkc4k\" (UniqueName: \"kubernetes.io/projected/a7baad79-9c29-4ba8-b007-24d7656d25d9-kube-api-access-bkc4k\") pod \"redhat-operators-4lbrd\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:23 crc kubenswrapper[4749]: I1001 14:53:23.653786 4749 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:24 crc kubenswrapper[4749]: I1001 14:53:24.179020 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4lbrd"] Oct 01 14:53:24 crc kubenswrapper[4749]: I1001 14:53:24.995686 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerID="8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e" exitCode=0 Oct 01 14:53:24 crc kubenswrapper[4749]: I1001 14:53:24.995735 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lbrd" event={"ID":"a7baad79-9c29-4ba8-b007-24d7656d25d9","Type":"ContainerDied","Data":"8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e"} Oct 01 14:53:24 crc kubenswrapper[4749]: I1001 14:53:24.995981 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lbrd" event={"ID":"a7baad79-9c29-4ba8-b007-24d7656d25d9","Type":"ContainerStarted","Data":"bd819527b58a18f75d6d8f3dd67507877bb0eb5f5d9158bb3827df31f3127922"} Oct 01 14:53:24 crc kubenswrapper[4749]: I1001 14:53:24.999163 4749 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 14:53:26 crc kubenswrapper[4749]: I1001 14:53:26.006088 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lbrd" event={"ID":"a7baad79-9c29-4ba8-b007-24d7656d25d9","Type":"ContainerStarted","Data":"00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f"} Oct 01 14:53:27 crc kubenswrapper[4749]: I1001 14:53:27.020306 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerID="00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f" exitCode=0 Oct 01 14:53:27 crc kubenswrapper[4749]: I1001 14:53:27.020699 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4lbrd" event={"ID":"a7baad79-9c29-4ba8-b007-24d7656d25d9","Type":"ContainerDied","Data":"00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f"} Oct 01 14:53:28 crc kubenswrapper[4749]: I1001 14:53:28.039085 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lbrd" event={"ID":"a7baad79-9c29-4ba8-b007-24d7656d25d9","Type":"ContainerStarted","Data":"3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc"} Oct 01 14:53:28 crc kubenswrapper[4749]: I1001 14:53:28.071284 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4lbrd" podStartSLOduration=2.373483654 podStartE2EDuration="5.071256267s" podCreationTimestamp="2025-10-01 14:53:23 +0000 UTC" firstStartedPulling="2025-10-01 14:53:24.998856362 +0000 UTC m=+6465.052841261" lastFinishedPulling="2025-10-01 14:53:27.696628965 +0000 UTC m=+6467.750613874" observedRunningTime="2025-10-01 14:53:28.062121054 +0000 UTC m=+6468.116105993" watchObservedRunningTime="2025-10-01 14:53:28.071256267 +0000 UTC m=+6468.125241196" Oct 01 14:53:30 crc kubenswrapper[4749]: I1001 14:53:30.230088 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:53:30 crc kubenswrapper[4749]: E1001 14:53:30.230732 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:53:33 crc kubenswrapper[4749]: I1001 14:53:33.654841 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:33 crc kubenswrapper[4749]: I1001 14:53:33.655329 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:33 crc kubenswrapper[4749]: I1001 14:53:33.704496 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:34 crc kubenswrapper[4749]: I1001 14:53:34.176340 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:34 crc kubenswrapper[4749]: I1001 14:53:34.227640 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4lbrd"] Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.143028 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4lbrd" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerName="registry-server" containerID="cri-o://3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc" gracePeriod=2 Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.753108 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.831083 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-utilities\") pod \"a7baad79-9c29-4ba8-b007-24d7656d25d9\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.831392 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-catalog-content\") pod \"a7baad79-9c29-4ba8-b007-24d7656d25d9\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.831527 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkc4k\" (UniqueName: \"kubernetes.io/projected/a7baad79-9c29-4ba8-b007-24d7656d25d9-kube-api-access-bkc4k\") pod \"a7baad79-9c29-4ba8-b007-24d7656d25d9\" (UID: \"a7baad79-9c29-4ba8-b007-24d7656d25d9\") " Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.832235 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-utilities" (OuterVolumeSpecName: "utilities") pod "a7baad79-9c29-4ba8-b007-24d7656d25d9" (UID: "a7baad79-9c29-4ba8-b007-24d7656d25d9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.832392 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.845634 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7baad79-9c29-4ba8-b007-24d7656d25d9-kube-api-access-bkc4k" (OuterVolumeSpecName: "kube-api-access-bkc4k") pod "a7baad79-9c29-4ba8-b007-24d7656d25d9" (UID: "a7baad79-9c29-4ba8-b007-24d7656d25d9"). InnerVolumeSpecName "kube-api-access-bkc4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:53:36 crc kubenswrapper[4749]: I1001 14:53:36.971289 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkc4k\" (UniqueName: \"kubernetes.io/projected/a7baad79-9c29-4ba8-b007-24d7656d25d9-kube-api-access-bkc4k\") on node \"crc\" DevicePath \"\"" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.082404 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7baad79-9c29-4ba8-b007-24d7656d25d9" (UID: "a7baad79-9c29-4ba8-b007-24d7656d25d9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.158697 4749 generic.go:334] "Generic (PLEG): container finished" podID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerID="3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc" exitCode=0 Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.158809 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lbrd" event={"ID":"a7baad79-9c29-4ba8-b007-24d7656d25d9","Type":"ContainerDied","Data":"3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc"} Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.158862 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4lbrd" event={"ID":"a7baad79-9c29-4ba8-b007-24d7656d25d9","Type":"ContainerDied","Data":"bd819527b58a18f75d6d8f3dd67507877bb0eb5f5d9158bb3827df31f3127922"} Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.158896 4749 scope.go:117] "RemoveContainer" containerID="3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.159136 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4lbrd" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.182593 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7baad79-9c29-4ba8-b007-24d7656d25d9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.211912 4749 scope.go:117] "RemoveContainer" containerID="00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.220852 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4lbrd"] Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.244236 4749 scope.go:117] "RemoveContainer" containerID="8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.246977 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4lbrd"] Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.292277 4749 scope.go:117] "RemoveContainer" containerID="3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc" Oct 01 14:53:37 crc kubenswrapper[4749]: E1001 14:53:37.293801 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc\": container with ID starting with 3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc not found: ID does not exist" containerID="3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.293848 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc"} err="failed to get container status 
\"3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc\": rpc error: code = NotFound desc = could not find container \"3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc\": container with ID starting with 3ab4a3b4888099aea1585440808842a2220778754131cc8e0e10095c595677bc not found: ID does not exist" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.293878 4749 scope.go:117] "RemoveContainer" containerID="00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f" Oct 01 14:53:37 crc kubenswrapper[4749]: E1001 14:53:37.294311 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f\": container with ID starting with 00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f not found: ID does not exist" containerID="00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.294358 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f"} err="failed to get container status \"00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f\": rpc error: code = NotFound desc = could not find container \"00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f\": container with ID starting with 00cb6ba49cdc657c372743774e291fb733bf848aa7eae710638c49ff413b419f not found: ID does not exist" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.294392 4749 scope.go:117] "RemoveContainer" containerID="8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e" Oct 01 14:53:37 crc kubenswrapper[4749]: E1001 14:53:37.294792 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e\": container with ID starting with 8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e not found: ID does not exist" containerID="8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e" Oct 01 14:53:37 crc kubenswrapper[4749]: I1001 14:53:37.294856 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e"} err="failed to get container status \"8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e\": rpc error: code = NotFound desc = could not find container \"8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e\": container with ID starting with 8452c3f8eb64722c3f98e969b30f70fedf6cb69c29e1784f9b5369d88c7f212e not found: ID does not exist" Oct 01 14:53:39 crc kubenswrapper[4749]: I1001 14:53:39.246500 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" path="/var/lib/kubelet/pods/a7baad79-9c29-4ba8-b007-24d7656d25d9/volumes" Oct 01 14:53:44 crc kubenswrapper[4749]: I1001 14:53:44.230073 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:53:44 crc kubenswrapper[4749]: E1001 14:53:44.230827 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:53:54 crc kubenswrapper[4749]: I1001 14:53:54.375403 4749 generic.go:334] "Generic (PLEG): container finished" podID="301b16e9-2377-4efe-a995-e13c0f737bf7" 
containerID="ba92da98feafedc1820b672dfc18702f25800a44afc58cafc537fc7c87cc7083" exitCode=0 Oct 01 14:53:54 crc kubenswrapper[4749]: I1001 14:53:54.375522 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qtqrp/must-gather-686kh" event={"ID":"301b16e9-2377-4efe-a995-e13c0f737bf7","Type":"ContainerDied","Data":"ba92da98feafedc1820b672dfc18702f25800a44afc58cafc537fc7c87cc7083"} Oct 01 14:53:54 crc kubenswrapper[4749]: I1001 14:53:54.377384 4749 scope.go:117] "RemoveContainer" containerID="ba92da98feafedc1820b672dfc18702f25800a44afc58cafc537fc7c87cc7083" Oct 01 14:53:54 crc kubenswrapper[4749]: I1001 14:53:54.755139 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qtqrp_must-gather-686kh_301b16e9-2377-4efe-a995-e13c0f737bf7/gather/0.log" Oct 01 14:53:57 crc kubenswrapper[4749]: I1001 14:53:57.262157 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:53:57 crc kubenswrapper[4749]: E1001 14:53:57.263423 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.221146 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qtqrp/must-gather-686kh"] Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.222018 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qtqrp/must-gather-686kh" podUID="301b16e9-2377-4efe-a995-e13c0f737bf7" containerName="copy" containerID="cri-o://d80978ee68b4c34c8b0c4cf05e341e825e11fe33bbc8249b0bdd00dc2b05de02" 
gracePeriod=2 Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.230514 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qtqrp/must-gather-686kh"] Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.231844 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:54:09 crc kubenswrapper[4749]: E1001 14:54:09.232188 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.536934 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qtqrp_must-gather-686kh_301b16e9-2377-4efe-a995-e13c0f737bf7/copy/0.log" Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.537549 4749 generic.go:334] "Generic (PLEG): container finished" podID="301b16e9-2377-4efe-a995-e13c0f737bf7" containerID="d80978ee68b4c34c8b0c4cf05e341e825e11fe33bbc8249b0bdd00dc2b05de02" exitCode=143 Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.676435 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qtqrp_must-gather-686kh_301b16e9-2377-4efe-a995-e13c0f737bf7/copy/0.log" Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.676810 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.788585 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dphrq\" (UniqueName: \"kubernetes.io/projected/301b16e9-2377-4efe-a995-e13c0f737bf7-kube-api-access-dphrq\") pod \"301b16e9-2377-4efe-a995-e13c0f737bf7\" (UID: \"301b16e9-2377-4efe-a995-e13c0f737bf7\") " Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.789013 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/301b16e9-2377-4efe-a995-e13c0f737bf7-must-gather-output\") pod \"301b16e9-2377-4efe-a995-e13c0f737bf7\" (UID: \"301b16e9-2377-4efe-a995-e13c0f737bf7\") " Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.803321 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301b16e9-2377-4efe-a995-e13c0f737bf7-kube-api-access-dphrq" (OuterVolumeSpecName: "kube-api-access-dphrq") pod "301b16e9-2377-4efe-a995-e13c0f737bf7" (UID: "301b16e9-2377-4efe-a995-e13c0f737bf7"). InnerVolumeSpecName "kube-api-access-dphrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:54:09 crc kubenswrapper[4749]: I1001 14:54:09.891410 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dphrq\" (UniqueName: \"kubernetes.io/projected/301b16e9-2377-4efe-a995-e13c0f737bf7-kube-api-access-dphrq\") on node \"crc\" DevicePath \"\"" Oct 01 14:54:10 crc kubenswrapper[4749]: I1001 14:54:10.093642 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301b16e9-2377-4efe-a995-e13c0f737bf7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "301b16e9-2377-4efe-a995-e13c0f737bf7" (UID: "301b16e9-2377-4efe-a995-e13c0f737bf7"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:54:10 crc kubenswrapper[4749]: I1001 14:54:10.095095 4749 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/301b16e9-2377-4efe-a995-e13c0f737bf7-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 14:54:10 crc kubenswrapper[4749]: I1001 14:54:10.550289 4749 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qtqrp_must-gather-686kh_301b16e9-2377-4efe-a995-e13c0f737bf7/copy/0.log" Oct 01 14:54:10 crc kubenswrapper[4749]: I1001 14:54:10.550861 4749 scope.go:117] "RemoveContainer" containerID="d80978ee68b4c34c8b0c4cf05e341e825e11fe33bbc8249b0bdd00dc2b05de02" Oct 01 14:54:10 crc kubenswrapper[4749]: I1001 14:54:10.551019 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qtqrp/must-gather-686kh" Oct 01 14:54:10 crc kubenswrapper[4749]: I1001 14:54:10.586847 4749 scope.go:117] "RemoveContainer" containerID="ba92da98feafedc1820b672dfc18702f25800a44afc58cafc537fc7c87cc7083" Oct 01 14:54:11 crc kubenswrapper[4749]: I1001 14:54:11.243213 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301b16e9-2377-4efe-a995-e13c0f737bf7" path="/var/lib/kubelet/pods/301b16e9-2377-4efe-a995-e13c0f737bf7/volumes" Oct 01 14:54:12 crc kubenswrapper[4749]: I1001 14:54:12.141181 4749 scope.go:117] "RemoveContainer" containerID="93143623d1371c30113bfc5ecaaa1b18b33dd887f335a1b935dc5c4a4b4a80ff" Oct 01 14:54:22 crc kubenswrapper[4749]: I1001 14:54:22.230801 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:54:22 crc kubenswrapper[4749]: E1001 14:54:22.232393 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:54:37 crc kubenswrapper[4749]: I1001 14:54:37.230773 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:54:37 crc kubenswrapper[4749]: E1001 14:54:37.233764 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:54:51 crc kubenswrapper[4749]: I1001 14:54:51.247796 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:54:51 crc kubenswrapper[4749]: E1001 14:54:51.248972 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.896513 4749 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ds5ll"] Oct 01 14:54:58 crc kubenswrapper[4749]: E1001 14:54:58.897563 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerName="extract-content" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 
14:54:58.897579 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerName="extract-content" Oct 01 14:54:58 crc kubenswrapper[4749]: E1001 14:54:58.897597 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301b16e9-2377-4efe-a995-e13c0f737bf7" containerName="gather" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.897606 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="301b16e9-2377-4efe-a995-e13c0f737bf7" containerName="gather" Oct 01 14:54:58 crc kubenswrapper[4749]: E1001 14:54:58.897634 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerName="registry-server" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.897642 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerName="registry-server" Oct 01 14:54:58 crc kubenswrapper[4749]: E1001 14:54:58.897662 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301b16e9-2377-4efe-a995-e13c0f737bf7" containerName="copy" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.897693 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="301b16e9-2377-4efe-a995-e13c0f737bf7" containerName="copy" Oct 01 14:54:58 crc kubenswrapper[4749]: E1001 14:54:58.897714 4749 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerName="extract-utilities" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.897723 4749 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerName="extract-utilities" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.897950 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7baad79-9c29-4ba8-b007-24d7656d25d9" containerName="registry-server" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.897969 4749 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="301b16e9-2377-4efe-a995-e13c0f737bf7" containerName="gather" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.897988 4749 memory_manager.go:354] "RemoveStaleState removing state" podUID="301b16e9-2377-4efe-a995-e13c0f737bf7" containerName="copy" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.902726 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:58 crc kubenswrapper[4749]: I1001 14:54:58.912378 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds5ll"] Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.029555 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-utilities\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.029972 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25t8h\" (UniqueName: \"kubernetes.io/projected/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-kube-api-access-25t8h\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.030094 4749 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-catalog-content\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.132456 4749 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-utilities\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.132952 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25t8h\" (UniqueName: \"kubernetes.io/projected/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-kube-api-access-25t8h\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.132987 4749 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-catalog-content\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.132998 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-utilities\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.133281 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-catalog-content\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.162235 4749 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-25t8h\" (UniqueName: \"kubernetes.io/projected/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-kube-api-access-25t8h\") pod \"redhat-marketplace-ds5ll\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.235186 4749 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:54:59 crc kubenswrapper[4749]: I1001 14:54:59.742545 4749 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds5ll"] Oct 01 14:55:00 crc kubenswrapper[4749]: I1001 14:55:00.103705 4749 generic.go:334] "Generic (PLEG): container finished" podID="beca9998-57c1-4e6d-b887-ca7a4abc9ea2" containerID="d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0" exitCode=0 Oct 01 14:55:00 crc kubenswrapper[4749]: I1001 14:55:00.103767 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds5ll" event={"ID":"beca9998-57c1-4e6d-b887-ca7a4abc9ea2","Type":"ContainerDied","Data":"d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0"} Oct 01 14:55:00 crc kubenswrapper[4749]: I1001 14:55:00.104077 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds5ll" event={"ID":"beca9998-57c1-4e6d-b887-ca7a4abc9ea2","Type":"ContainerStarted","Data":"2e296a656cb33134cfa081abafd32f42d98158f1ab194acc51300fc143f1139d"} Oct 01 14:55:01 crc kubenswrapper[4749]: I1001 14:55:01.120599 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds5ll" event={"ID":"beca9998-57c1-4e6d-b887-ca7a4abc9ea2","Type":"ContainerStarted","Data":"08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8"} Oct 01 14:55:02 crc kubenswrapper[4749]: I1001 14:55:02.135140 4749 generic.go:334] "Generic (PLEG): container finished" 
podID="beca9998-57c1-4e6d-b887-ca7a4abc9ea2" containerID="08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8" exitCode=0 Oct 01 14:55:02 crc kubenswrapper[4749]: I1001 14:55:02.135199 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds5ll" event={"ID":"beca9998-57c1-4e6d-b887-ca7a4abc9ea2","Type":"ContainerDied","Data":"08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8"} Oct 01 14:55:03 crc kubenswrapper[4749]: I1001 14:55:03.147917 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds5ll" event={"ID":"beca9998-57c1-4e6d-b887-ca7a4abc9ea2","Type":"ContainerStarted","Data":"883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d"} Oct 01 14:55:03 crc kubenswrapper[4749]: I1001 14:55:03.182008 4749 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ds5ll" podStartSLOduration=2.5613556539999998 podStartE2EDuration="5.181987741s" podCreationTimestamp="2025-10-01 14:54:58 +0000 UTC" firstStartedPulling="2025-10-01 14:55:00.105984451 +0000 UTC m=+6560.159969360" lastFinishedPulling="2025-10-01 14:55:02.726616538 +0000 UTC m=+6562.780601447" observedRunningTime="2025-10-01 14:55:03.171537689 +0000 UTC m=+6563.225522598" watchObservedRunningTime="2025-10-01 14:55:03.181987741 +0000 UTC m=+6563.235972660" Oct 01 14:55:04 crc kubenswrapper[4749]: I1001 14:55:04.229571 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:55:04 crc kubenswrapper[4749]: E1001 14:55:04.230131 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:55:09 crc kubenswrapper[4749]: I1001 14:55:09.247754 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:55:09 crc kubenswrapper[4749]: I1001 14:55:09.249625 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:55:09 crc kubenswrapper[4749]: I1001 14:55:09.305709 4749 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:55:10 crc kubenswrapper[4749]: I1001 14:55:10.289089 4749 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:55:10 crc kubenswrapper[4749]: I1001 14:55:10.348639 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds5ll"] Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.241089 4749 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ds5ll" podUID="beca9998-57c1-4e6d-b887-ca7a4abc9ea2" containerName="registry-server" containerID="cri-o://883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d" gracePeriod=2 Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.695012 4749 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.819294 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25t8h\" (UniqueName: \"kubernetes.io/projected/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-kube-api-access-25t8h\") pod \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.819675 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-utilities\") pod \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.819913 4749 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-catalog-content\") pod \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\" (UID: \"beca9998-57c1-4e6d-b887-ca7a4abc9ea2\") " Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.820907 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-utilities" (OuterVolumeSpecName: "utilities") pod "beca9998-57c1-4e6d-b887-ca7a4abc9ea2" (UID: "beca9998-57c1-4e6d-b887-ca7a4abc9ea2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.825890 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-kube-api-access-25t8h" (OuterVolumeSpecName: "kube-api-access-25t8h") pod "beca9998-57c1-4e6d-b887-ca7a4abc9ea2" (UID: "beca9998-57c1-4e6d-b887-ca7a4abc9ea2"). InnerVolumeSpecName "kube-api-access-25t8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.838606 4749 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "beca9998-57c1-4e6d-b887-ca7a4abc9ea2" (UID: "beca9998-57c1-4e6d-b887-ca7a4abc9ea2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.922834 4749 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25t8h\" (UniqueName: \"kubernetes.io/projected/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-kube-api-access-25t8h\") on node \"crc\" DevicePath \"\"" Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.922867 4749 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:55:12 crc kubenswrapper[4749]: I1001 14:55:12.922880 4749 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/beca9998-57c1-4e6d-b887-ca7a4abc9ea2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.257098 4749 generic.go:334] "Generic (PLEG): container finished" podID="beca9998-57c1-4e6d-b887-ca7a4abc9ea2" containerID="883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d" exitCode=0 Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.257154 4749 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ds5ll" event={"ID":"beca9998-57c1-4e6d-b887-ca7a4abc9ea2","Type":"ContainerDied","Data":"883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d"} Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.257188 4749 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-ds5ll" event={"ID":"beca9998-57c1-4e6d-b887-ca7a4abc9ea2","Type":"ContainerDied","Data":"2e296a656cb33134cfa081abafd32f42d98158f1ab194acc51300fc143f1139d"} Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.257210 4749 scope.go:117] "RemoveContainer" containerID="883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.257435 4749 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ds5ll" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.283290 4749 scope.go:117] "RemoveContainer" containerID="08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.302569 4749 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds5ll"] Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.307758 4749 scope.go:117] "RemoveContainer" containerID="d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.313259 4749 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ds5ll"] Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.347061 4749 scope.go:117] "RemoveContainer" containerID="883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d" Oct 01 14:55:13 crc kubenswrapper[4749]: E1001 14:55:13.347619 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d\": container with ID starting with 883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d not found: ID does not exist" containerID="883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.347668 4749 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d"} err="failed to get container status \"883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d\": rpc error: code = NotFound desc = could not find container \"883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d\": container with ID starting with 883393dc692992def77df8341f6c084a4d90011aedc9fa793f08ded82b2acb4d not found: ID does not exist" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.347703 4749 scope.go:117] "RemoveContainer" containerID="08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8" Oct 01 14:55:13 crc kubenswrapper[4749]: E1001 14:55:13.348148 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8\": container with ID starting with 08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8 not found: ID does not exist" containerID="08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.348212 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8"} err="failed to get container status \"08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8\": rpc error: code = NotFound desc = could not find container \"08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8\": container with ID starting with 08a37c1c3d8031d797097fbd1926b46d1e809182469ffbdd5318a58f596eafb8 not found: ID does not exist" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.348268 4749 scope.go:117] "RemoveContainer" containerID="d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0" Oct 01 14:55:13 crc kubenswrapper[4749]: E1001 
14:55:13.348760 4749 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0\": container with ID starting with d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0 not found: ID does not exist" containerID="d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0" Oct 01 14:55:13 crc kubenswrapper[4749]: I1001 14:55:13.348792 4749 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0"} err="failed to get container status \"d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0\": rpc error: code = NotFound desc = could not find container \"d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0\": container with ID starting with d0e28ae41aae6ab2110dfcd9b7587c56ec83938c53c29955f839283125b543d0 not found: ID does not exist" Oct 01 14:55:15 crc kubenswrapper[4749]: I1001 14:55:15.241750 4749 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beca9998-57c1-4e6d-b887-ca7a4abc9ea2" path="/var/lib/kubelet/pods/beca9998-57c1-4e6d-b887-ca7a4abc9ea2/volumes" Oct 01 14:55:17 crc kubenswrapper[4749]: I1001 14:55:17.230372 4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:55:17 crc kubenswrapper[4749]: E1001 14:55:17.231631 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf" Oct 01 14:55:29 crc kubenswrapper[4749]: I1001 14:55:29.232186 
4749 scope.go:117] "RemoveContainer" containerID="a3f2f3dc107fd24e2ad6e021c7d32d5bb618ac22273f288eb13ab4834eeeff0e" Oct 01 14:55:29 crc kubenswrapper[4749]: E1001 14:55:29.232929 4749 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4tfdz_openshift-machine-config-operator(c763aedc-e75b-471c-83d7-2c9a87da1aaf)\"" pod="openshift-machine-config-operator/machine-config-daemon-4tfdz" podUID="c763aedc-e75b-471c-83d7-2c9a87da1aaf"